Mar 09 02:41:14 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 09 02:41:14 crc restorecon[4745]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 02:41:14 crc restorecon[4745]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc 
restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc 
restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 
02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc 
restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc 
restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:14
crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 
02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 02:41:14 crc 
restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc 
restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc 
restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 02:41:14 crc restorecon[4745]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:14 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 
crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc 
restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc 
restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc 
restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc 
restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc 
restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 02:41:15 crc restorecon[4745]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 02:41:15 crc restorecon[4745]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 09 02:41:15 crc kubenswrapper[4901]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 02:41:15 crc kubenswrapper[4901]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 09 02:41:15 crc kubenswrapper[4901]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 02:41:15 crc kubenswrapper[4901]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 09 02:41:15 crc kubenswrapper[4901]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 09 02:41:15 crc kubenswrapper[4901]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.851823 4901 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857785 4901 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857818 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857828 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857838 4901 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857849 4901 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857861 4901 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857873 4901 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857883 4901 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857891 4901 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857921 4901 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857930 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857941 4901 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857950 4901 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857958 4901 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857968 4901 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857978 4901 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857986 4901 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.857995 4901 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858003 4901 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858011 4901 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858018 4901 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858027 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858037 4901 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858046 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858054 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858065 4901 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858073 4901 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858081 4901 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858088 4901 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858096 4901 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858104 4901 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858111 4901 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858119 4901 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858127 4901 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858135 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858142 4901 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858150 4901 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858159 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858168 4901 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858176 4901 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858184 4901 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858192 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858200 4901 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858208 4901 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858215 4901 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858249 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858257 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858265 4901 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858273 4901 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858280 4901 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858288 4901 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858295 4901 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858303 4901 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858311 4901 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858318 4901 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858326 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858334 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858343 4901 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858352 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858359 4901 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858367 4901 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858374 4901 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858382 4901 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858389 4901 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858397 4901 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858404 4901 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858412 4901 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858422 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858430 4901 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858438 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.858445 4901 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858572 4901 flags.go:64] FLAG: --address="0.0.0.0"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858590 4901 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858603 4901 flags.go:64] FLAG: --anonymous-auth="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858614 4901 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858626 4901 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858635 4901 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858649 4901 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858659 4901 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858669 4901 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858678 4901 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858688 4901 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858697 4901 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858706 4901 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858714 4901 flags.go:64] FLAG: --cgroup-root=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858723 4901 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858732 4901 flags.go:64] FLAG: --client-ca-file=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858740 4901 flags.go:64] FLAG: --cloud-config=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858749 4901 flags.go:64] FLAG: --cloud-provider=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858759 4901 flags.go:64] FLAG: --cluster-dns="[]"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858769 4901 flags.go:64] FLAG: --cluster-domain=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858778 4901 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858787 4901 flags.go:64] FLAG: --config-dir=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858796 4901 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858805 4901 flags.go:64] FLAG: --container-log-max-files="5"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858816 4901 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858825 4901 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858835 4901 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858844 4901 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858853 4901 flags.go:64] FLAG: --contention-profiling="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858863 4901 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858872 4901 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858881 4901 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858890 4901 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858900 4901 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858910 4901 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858918 4901 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858927 4901 flags.go:64] FLAG: --enable-load-reader="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858937 4901 flags.go:64] FLAG: --enable-server="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858946 4901 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858958 4901 flags.go:64] FLAG: --event-burst="100"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858967 4901 flags.go:64] FLAG: --event-qps="50"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858976 4901 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858985 4901 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.858993 4901 flags.go:64] FLAG: --eviction-hard=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859004 4901 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859013 4901 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859022 4901 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859031 4901 flags.go:64] FLAG: --eviction-soft=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859040 4901 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859048 4901 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859058 4901 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859067 4901 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859076 4901 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859085 4901 flags.go:64] FLAG: --fail-swap-on="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859094 4901 flags.go:64] FLAG: --feature-gates=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859105 4901 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859114 4901 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859124 4901 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859133 4901 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859143 4901 flags.go:64] FLAG: --healthz-port="10248"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859152 4901 flags.go:64] FLAG: --help="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859161 4901 flags.go:64] FLAG: --hostname-override=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859169 4901 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859178 4901 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859187 4901 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859196 4901 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859205 4901 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859213 4901 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859251 4901 flags.go:64] FLAG: --image-service-endpoint=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859261 4901 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859270 4901 flags.go:64] FLAG: --kube-api-burst="100"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859279 4901 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859288 4901 flags.go:64] FLAG: --kube-api-qps="50"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859298 4901 flags.go:64] FLAG: --kube-reserved=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859307 4901 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859315 4901 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859324 4901 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859333 4901 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859342 4901 flags.go:64] FLAG: --lock-file=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859350 4901 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859359 4901 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859369 4901 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859383 4901 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859392 4901 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859401 4901 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859409 4901 flags.go:64] FLAG: --logging-format="text"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859418 4901 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859428 4901 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859436 4901 flags.go:64] FLAG: --manifest-url=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859444 4901 flags.go:64] FLAG: --manifest-url-header=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859455 4901 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859465 4901 flags.go:64] FLAG: --max-open-files="1000000"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859475 4901 flags.go:64] FLAG: --max-pods="110"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859484 4901 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859494 4901 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859502 4901 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859511 4901 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859520 4901 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859529 4901 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859538 4901 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859556 4901 flags.go:64] FLAG: --node-status-max-images="50"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859565 4901 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859574 4901 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859583 4901 flags.go:64] FLAG: --pod-cidr=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859592 4901 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859604 4901 flags.go:64] FLAG: --pod-manifest-path=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859613 4901 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859622 4901 flags.go:64] FLAG: --pods-per-core="0"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859631 4901 flags.go:64] FLAG: --port="10250"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859643 4901 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859652 4901 flags.go:64] FLAG: --provider-id=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859660 4901 flags.go:64] FLAG: --qos-reserved=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859669 4901 flags.go:64] FLAG: --read-only-port="10255"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859679 4901 flags.go:64] FLAG: --register-node="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859687 4901 flags.go:64] FLAG: --register-schedulable="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859696 4901 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859710 4901 flags.go:64] FLAG: --registry-burst="10"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859720 4901 flags.go:64] FLAG: --registry-qps="5"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859728 4901 flags.go:64] FLAG: --reserved-cpus=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859737 4901 flags.go:64] FLAG: --reserved-memory=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859747 4901 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859756 4901 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859765 4901 flags.go:64] FLAG: --rotate-certificates="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859774 4901 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859783 4901 flags.go:64] FLAG: --runonce="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859791 4901 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859800 4901 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859810 4901 flags.go:64] FLAG: --seccomp-default="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859818 4901 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859827 4901 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859836 4901 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859845 4901 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859854 4901 flags.go:64] FLAG: --storage-driver-password="root"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859862 4901 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859871 4901 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859880 4901 flags.go:64] FLAG: --storage-driver-user="root"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859888 4901 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859897 4901 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859906 4901 flags.go:64] FLAG: --system-cgroups=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859915 4901 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859928 4901 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859937 4901 flags.go:64] FLAG: --tls-cert-file=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859947 4901 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859957 4901 flags.go:64] FLAG: --tls-min-version=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859965 4901 flags.go:64] FLAG: --tls-private-key-file=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859975 4901 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859984 4901 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.859993 4901 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.860002 4901 flags.go:64] FLAG: --v="2"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.860013 4901 flags.go:64] FLAG: --version="false"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.860023 4901 flags.go:64] FLAG: --vmodule=""
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.860033 4901 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.860042 4901 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860270 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860282 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860291 4901 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860301 4901 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860309 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860318 4901 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860326 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860334 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860342 4901 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860349 4901 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860357 4901 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860370 4901 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860378 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860386 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860393 4901 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860401 4901 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860409 4901 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860416 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860424 4901 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860449 4901 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860459 4901 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860469 4901 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860477 4901 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860485 4901 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860494 4901 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860503 4901 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860512 4901 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860520 4901 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860529 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860537 4901 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860545 4901 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860553 4901 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860561 4901 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860568 4901 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860576 4901 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860583 4901 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860591 4901 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860602 4901 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860611 4901 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860620 4901 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860630 4901 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860638 4901 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860646 4901 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860658 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860665 4901 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860673 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860681 4901 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860689 4901 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860696 4901 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860704 4901 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860711 4901 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860721 4901 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860729 4901 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860738 4901 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860746 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860753 4901 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860763 4901 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860773 4901 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860782 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860791 4901 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860799 4901 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860808 4901 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860816 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860825 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860833 4901 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860841 4901 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860848 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860856 4901 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860864 4901 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860872 4901 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.860879 4901 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.860891 4901 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.871401 4901 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.871463 4901 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871552 4901 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871561 4901 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871566 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871572 4901 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871577 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871580 4901 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871585 4901 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871589 4901 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871592 4901 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871596 4901 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871600 4901 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871603 4901 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871607 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871611 4901 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871614 4901 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871617 4901 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871621 4901 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871626 4901 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871632 4901 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871636 4901 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871640 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871643 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871647 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871651 4901 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871654 4901 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871658 4901 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871661 4901 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871665 4901 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871668 4901 feature_gate.go:330] unrecognized feature gate: Example Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871673 4901 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871679 4901 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871683 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871687 4901 
feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871691 4901 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871702 4901 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871706 4901 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871712 4901 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871720 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871724 4901 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871728 4901 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871732 4901 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871735 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871739 4901 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871743 4901 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871747 4901 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871751 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871755 4901 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871760 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871764 4901 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871767 4901 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871772 4901 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871776 4901 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871780 4901 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871785 4901 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871789 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871794 4901 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871798 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871802 4901 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871807 4901 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871811 4901 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871818 4901 feature_gate.go:353] Setting GA feature gate 
CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871823 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871828 4901 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871832 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871836 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871840 4901 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871843 4901 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871847 4901 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871851 4901 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871854 4901 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.871866 4901 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.871875 4901 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 
02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872085 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872101 4901 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872107 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872112 4901 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872117 4901 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872121 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872126 4901 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872132 4901 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872140 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872162 4901 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872168 4901 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872173 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872178 4901 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872184 4901 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872188 4901 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872193 4901 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872197 4901 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872201 4901 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872204 4901 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872209 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872213 4901 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872220 4901 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy 
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872238 4901 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872243 4901 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872248 4901 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872253 4901 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872257 4901 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872261 4901 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872266 4901 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872273 4901 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872277 4901 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872282 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872286 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872291 4901 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872305 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872310 4901 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872315 4901 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872319 4901 feature_gate.go:330] unrecognized feature gate: Example Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872323 4901 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872328 4901 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872331 4901 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872337 4901 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872341 4901 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872344 4901 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872348 4901 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872352 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872357 4901 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872361 4901 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872365 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872369 4901 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872373 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872377 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872380 4901 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872384 4901 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872387 4901 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872391 4901 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872395 4901 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872399 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872402 4901 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872406 4901 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872409 4901 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872413 4901 feature_gate.go:330] unrecognized 
feature gate: MixedCPUsAllocation Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872416 4901 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872420 4901 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872423 4901 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872427 4901 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872430 4901 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872433 4901 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872437 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872440 4901 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 02:41:15 crc kubenswrapper[4901]: W0309 02:41:15.872450 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.872458 4901 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.873490 4901 server.go:940] "Client rotation is on, will bootstrap in background" 
Mar 09 02:41:15 crc kubenswrapper[4901]: E0309 02:41:15.877073 4901 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.882125 4901 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.882303 4901 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.886743 4901 server.go:997] "Starting client certificate rotation" Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.886793 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.887406 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.914060 4901 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 02:41:15 crc kubenswrapper[4901]: E0309 02:41:15.919997 4901 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.921471 4901 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.939824 4901 log.go:25] "Validated CRI v1 
runtime API" Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.980331 4901 log.go:25] "Validated CRI v1 image API" Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.982288 4901 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.987398 4901 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-09-02-36-40-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 09 02:41:15 crc kubenswrapper[4901]: I0309 02:41:15.987435 4901 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.003747 4901 manager.go:217] Machine: {Timestamp:2026-03-09 02:41:16.001034068 +0000 UTC m=+0.590697820 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:92bcf75b-bfed-4296-bbf2-d35c6ac3a586 BootID:4fb5477d-c9aa-418f-9a0d-560ac0227b13 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd 
DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:37:27:2b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:37:27:2b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4a:1a:9d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:18:34:1d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6b:36:a6 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:8b:d2:62 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:b9:c4:81 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:96:6e:02:6b:04:ec Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:66:f9:3c:83:31:89 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] 
Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.004030 4901 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.004252 4901 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.004643 4901 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.004846 4901 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.004920 4901 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.005247 4901 topology_manager.go:138] "Creating topology manager with none policy" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.005260 4901 container_manager_linux.go:303] "Creating device plugin manager" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.005803 4901 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.005837 4901 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.006687 4901 state_mem.go:36] "Initialized new in-memory state store" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.006792 4901 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.011507 4901 kubelet.go:418] "Attempting to sync node with API server" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.011535 4901 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.011557 4901 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.011576 4901 kubelet.go:324] "Adding apiserver pod source" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.011592 4901 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.016515 4901 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.020014 4901 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 09 02:41:16 crc kubenswrapper[4901]: W0309 02:41:16.021528 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.021747 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:16 crc kubenswrapper[4901]: W0309 02:41:16.022298 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.023452 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.025311 4901 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027390 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027435 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027452 
4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027469 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027492 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027505 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027519 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027542 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027558 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027574 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027614 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.027629 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.028686 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.029480 4901 server.go:1280] "Started kubelet" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.030550 4901 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.030696 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.20:6443: connect: connection refused Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.030702 4901 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.031838 4901 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 09 02:41:16 crc systemd[1]: Started Kubernetes Kubelet. Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.035479 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.035527 4901 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.035763 4901 server.go:460] "Adding debug handlers to kubelet server" Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.035983 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.036054 4901 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.036085 4901 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.036168 4901 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.036642 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="200ms" Mar 09 02:41:16 crc kubenswrapper[4901]: W0309 02:41:16.037263 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.037437 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.037976 4901 factory.go:55] Registering systemd factory Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.038016 4901 factory.go:221] Registration of the systemd container factory successfully Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.037264 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.20:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b0c00d5f31a5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,LastTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.038507 4901 factory.go:153] Registering CRI-O factory Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.038544 4901 factory.go:221] Registration of the crio container factory successfully Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.038674 4901 
factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.038731 4901 factory.go:103] Registering Raw factory Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.038769 4901 manager.go:1196] Started watching for new ooms in manager Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.040408 4901 manager.go:319] Starting recovery of all containers Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057181 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057292 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057316 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057346 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057368 4901 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057397 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057464 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057490 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057521 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057549 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057568 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057588 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057609 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057638 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057659 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057681 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057706 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057765 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057787 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057808 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057838 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057857 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057877 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" 
seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057902 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057925 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057950 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057974 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.057999 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058054 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 
02:41:16.058076 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058099 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058118 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058136 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058162 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058181 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058203 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058255 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058277 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058297 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058316 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058338 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058364 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058384 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058428 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058453 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058475 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058496 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058517 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058541 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058563 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058583 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058603 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058631 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058652 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058675 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058698 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058720 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058748 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058770 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058791 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058812 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058832 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058851 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058872 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058892 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058913 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058932 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058952 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058973 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.058994 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059015 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059038 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059057 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059077 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059097 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059119 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059139 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059159 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059180 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059202 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059250 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059274 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059294 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059316 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059335 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059354 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059375 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059429 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059456 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059479 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" 
seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059499 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059573 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059596 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059616 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059639 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059659 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059681 
4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059702 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059724 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059750 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059773 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059799 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059823 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059847 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059878 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059908 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059930 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059951 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059970 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.059991 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060011 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060031 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060054 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060075 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060096 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060147 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060166 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060188 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060211 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060268 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060287 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" 
seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060308 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060327 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060348 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060368 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060392 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060413 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060432 4901 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060452 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060473 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060493 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060517 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060537 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060557 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060582 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060654 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060675 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060700 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060721 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060742 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060760 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060777 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060797 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060854 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060876 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060897 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.060918 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061040 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061069 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061087 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061106 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061126 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" 
seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061145 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061168 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061187 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061206 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061250 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061268 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 
02:41:16.061289 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061310 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061329 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061349 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061369 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061390 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061413 4901 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061451 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061477 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061533 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061555 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061576 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061598 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061619 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061638 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061657 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061676 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061697 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061744 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" 
Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061765 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061786 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061804 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061827 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.061847 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066386 4901 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066478 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066501 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066518 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066534 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066550 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066566 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066579 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066593 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066610 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066626 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066641 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066654 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066671 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066686 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066702 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066717 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066730 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066749 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066764 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066778 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066796 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066810 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066826 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066840 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066857 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066868 4901 reconstruct.go:97] "Volume reconstruction finished" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.066881 4901 reconciler.go:26] "Reconciler: start to sync state" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.083794 4901 manager.go:324] Recovery completed Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.101432 4901 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.102145 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.104337 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.104409 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.104432 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.104855 4901 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.104924 4901 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.104970 4901 kubelet.go:2335] "Starting kubelet main sync loop" Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.105144 4901 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.108721 4901 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.108787 4901 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 09 02:41:16 crc kubenswrapper[4901]: W0309 02:41:16.108738 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.108820 4901 state_mem.go:36] "Initialized new in-memory state store" Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.108844 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.132117 4901 policy_none.go:49] "None policy: Start" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.133343 4901 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.133383 4901 state_mem.go:35] "Initializing new in-memory state store" Mar 09 02:41:16 crc 
kubenswrapper[4901]: E0309 02:41:16.136815 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.181340 4901 manager.go:334] "Starting Device Plugin manager" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.181398 4901 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.181416 4901 server.go:79] "Starting device plugin registration server" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.182002 4901 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.182029 4901 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.182494 4901 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.182581 4901 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.182591 4901 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.196115 4901 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.205515 4901 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.205662 4901 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.207323 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.207354 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.207367 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.207528 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.208395 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.208444 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.212070 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.212101 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.212113 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.212295 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.212350 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 
crc kubenswrapper[4901]: I0309 02:41:16.212378 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.212622 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.212801 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.212865 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.214251 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.214376 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.214493 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.214293 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.214696 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.214740 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.214984 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.215143 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.215207 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.216941 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.216981 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.216997 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.217202 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.217371 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.217503 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.217412 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.217884 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.217292 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.219848 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.219906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.219924 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.220321 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.220390 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.220922 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.221519 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.221565 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.221583 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.221770 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.221797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.237589 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="400ms" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.268545 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.268599 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.268647 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.268685 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.268719 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.268750 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.268782 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.268813 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.268892 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.268961 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.269028 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 
02:41:16.269092 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.269156 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.269210 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.269304 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.282405 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.283953 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.284021 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc 
kubenswrapper[4901]: I0309 02:41:16.284041 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.284085 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.284835 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.20:6443: connect: connection refused" node="crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.370707 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.370777 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.370813 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.370848 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.370883 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.370914 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.370947 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371011 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.370970 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371022 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371134 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371162 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371173 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371182 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371192 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371313 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371382 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371370 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371448 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371513 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc 
kubenswrapper[4901]: I0309 02:41:16.371549 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371614 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371627 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371684 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371686 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371729 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 
crc kubenswrapper[4901]: I0309 02:41:16.371757 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371796 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371802 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.371831 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.485696 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.488056 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.488114 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.488131 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.488167 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.488946 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.20:6443: connect: connection refused" node="crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.541271 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.560258 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.578538 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.595050 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: W0309 02:41:16.603879 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a8be76bcc16aab83d1cd1b64b2644b7a2dd866621a7f39650ac7b2103c765844 WatchSource:0}: Error finding container a8be76bcc16aab83d1cd1b64b2644b7a2dd866621a7f39650ac7b2103c765844: Status 404 returned error can't find the container with id a8be76bcc16aab83d1cd1b64b2644b7a2dd866621a7f39650ac7b2103c765844 Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.604254 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 02:41:16 crc kubenswrapper[4901]: W0309 02:41:16.604704 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b7f4547a6f6966bacf8dc0e12b28c24a31e02908ead090683eefcc7c2b0c04f6 WatchSource:0}: Error finding container b7f4547a6f6966bacf8dc0e12b28c24a31e02908ead090683eefcc7c2b0c04f6: Status 404 returned error can't find the container with id b7f4547a6f6966bacf8dc0e12b28c24a31e02908ead090683eefcc7c2b0c04f6 Mar 09 02:41:16 crc kubenswrapper[4901]: W0309 02:41:16.610286 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-4bca778df96eb80dedf582089a308f5b283934c7b5083416512a978d96eb1d19 WatchSource:0}: Error finding container 4bca778df96eb80dedf582089a308f5b283934c7b5083416512a978d96eb1d19: Status 404 returned error can't find the container with id 4bca778df96eb80dedf582089a308f5b283934c7b5083416512a978d96eb1d19 Mar 09 02:41:16 crc kubenswrapper[4901]: W0309 02:41:16.618653 4901 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-f81de63807c4fc5f747289fdc31efcb7d8dcf54e2ae848b05c330e3368421e86 WatchSource:0}: Error finding container f81de63807c4fc5f747289fdc31efcb7d8dcf54e2ae848b05c330e3368421e86: Status 404 returned error can't find the container with id f81de63807c4fc5f747289fdc31efcb7d8dcf54e2ae848b05c330e3368421e86 Mar 09 02:41:16 crc kubenswrapper[4901]: W0309 02:41:16.622086 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-3cd6fbe79b0bea350f9a8d43dd585990647f08ed5cd51d4ec1ef47e28aa7dd94 WatchSource:0}: Error finding container 3cd6fbe79b0bea350f9a8d43dd585990647f08ed5cd51d4ec1ef47e28aa7dd94: Status 404 returned error can't find the container with id 3cd6fbe79b0bea350f9a8d43dd585990647f08ed5cd51d4ec1ef47e28aa7dd94 Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.639155 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="800ms" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.890058 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.892698 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.892752 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 02:41:16.892764 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:16 crc kubenswrapper[4901]: I0309 
02:41:16.892796 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:41:16 crc kubenswrapper[4901]: E0309 02:41:16.893627 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.20:6443: connect: connection refused" node="crc" Mar 09 02:41:17 crc kubenswrapper[4901]: I0309 02:41:17.032267 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:17 crc kubenswrapper[4901]: I0309 02:41:17.114558 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f81de63807c4fc5f747289fdc31efcb7d8dcf54e2ae848b05c330e3368421e86"} Mar 09 02:41:17 crc kubenswrapper[4901]: I0309 02:41:17.116717 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4bca778df96eb80dedf582089a308f5b283934c7b5083416512a978d96eb1d19"} Mar 09 02:41:17 crc kubenswrapper[4901]: I0309 02:41:17.118677 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a8be76bcc16aab83d1cd1b64b2644b7a2dd866621a7f39650ac7b2103c765844"} Mar 09 02:41:17 crc kubenswrapper[4901]: I0309 02:41:17.120342 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b7f4547a6f6966bacf8dc0e12b28c24a31e02908ead090683eefcc7c2b0c04f6"} Mar 09 02:41:17 crc 
kubenswrapper[4901]: I0309 02:41:17.122546 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3cd6fbe79b0bea350f9a8d43dd585990647f08ed5cd51d4ec1ef47e28aa7dd94"} Mar 09 02:41:17 crc kubenswrapper[4901]: W0309 02:41:17.155124 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:17 crc kubenswrapper[4901]: E0309 02:41:17.155315 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:17 crc kubenswrapper[4901]: W0309 02:41:17.164626 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:17 crc kubenswrapper[4901]: E0309 02:41:17.164816 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:17 crc kubenswrapper[4901]: W0309 02:41:17.268497 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:17 crc kubenswrapper[4901]: E0309 02:41:17.268715 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:17 crc kubenswrapper[4901]: W0309 02:41:17.291349 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:17 crc kubenswrapper[4901]: E0309 02:41:17.291456 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:17 crc kubenswrapper[4901]: E0309 02:41:17.440844 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="1.6s" Mar 09 02:41:17 crc kubenswrapper[4901]: I0309 02:41:17.693798 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:17 crc kubenswrapper[4901]: I0309 02:41:17.695248 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:17 crc kubenswrapper[4901]: I0309 
02:41:17.695285 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:17 crc kubenswrapper[4901]: I0309 02:41:17.695296 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:17 crc kubenswrapper[4901]: I0309 02:41:17.695319 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:41:17 crc kubenswrapper[4901]: E0309 02:41:17.695608 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.20:6443: connect: connection refused" node="crc" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.032128 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.111813 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 02:41:18 crc kubenswrapper[4901]: E0309 02:41:18.113063 4901 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.127995 4901 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f80cb96192c8d0dc32db69a7503fe1ee949c1e3510cb80e52e02d840b5f23d62" exitCode=0 Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.128118 4901 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.128124 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f80cb96192c8d0dc32db69a7503fe1ee949c1e3510cb80e52e02d840b5f23d62"} Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.129973 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.130016 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.130033 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.131609 4901 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82" exitCode=0 Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.131692 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82"} Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.131769 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.132757 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.132807 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:18 crc 
kubenswrapper[4901]: I0309 02:41:18.132824 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.136255 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff"} Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.136324 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a9c9580eb124dab4af78808f51714f0bc0d74cda42b20a800c3f85dccafb84ec"} Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.136355 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97"} Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.139679 4901 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765" exitCode=0 Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.139790 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765"} Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.139906 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.144178 4901 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.144248 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.144264 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.146133 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf" exitCode=0 Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.146202 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf"} Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.146427 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.148129 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.148202 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.148276 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.153684 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.154644 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:18 crc 
kubenswrapper[4901]: I0309 02:41:18.154702 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:18 crc kubenswrapper[4901]: I0309 02:41:18.154751 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:18 crc kubenswrapper[4901]: W0309 02:41:18.869055 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:18 crc kubenswrapper[4901]: E0309 02:41:18.869143 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.032286 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:19 crc kubenswrapper[4901]: E0309 02:41:19.041806 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="3.2s" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.159718 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bf2092a65f531dedb9529485252943746497cea4fec2b6c52ed5220eed868129"} Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.159762 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d78e716f348e3907e1861b6fd17ba499078ec41d0ee31bc8855d9744e4b0f640"} Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.159774 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a3836e17de731d173fb726c092d6f4a3ed70b2eab0297fb4e146004c22a3fbc6"} Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.160947 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.162010 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.162034 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.162043 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.165648 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df"} Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.165701 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:19 crc 
kubenswrapper[4901]: I0309 02:41:19.170067 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.170163 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.170266 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.173035 4901 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886" exitCode=0 Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.173130 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886"} Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.173248 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.174217 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.174311 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.174365 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.186833 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf"} Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.186909 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd"} Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.186937 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552"} Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.186956 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38"} Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.192643 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"191fc8d52a7e5ac730ae6ee2a0c0c4c5b6f7c8299decd49623f3c1bb253ecf38"} Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.192772 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.194732 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.194809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.194832 
4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:19 crc kubenswrapper[4901]: W0309 02:41:19.258527 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:19 crc kubenswrapper[4901]: E0309 02:41:19.258671 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:19 crc kubenswrapper[4901]: W0309 02:41:19.266037 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.20:6443: connect: connection refused Mar 09 02:41:19 crc kubenswrapper[4901]: E0309 02:41:19.266332 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.20:6443: connect: connection refused" logger="UnhandledError" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.296358 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.297659 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.297700 4901 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.297714 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.297747 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:41:19 crc kubenswrapper[4901]: E0309 02:41:19.298268 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.20:6443: connect: connection refused" node="crc" Mar 09 02:41:19 crc kubenswrapper[4901]: I0309 02:41:19.645939 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.200086 4901 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4" exitCode=0 Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.200352 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4"} Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.200847 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.202290 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.202359 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.202381 4901 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.209735 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2d20dc59e6126e5e586d975f53336f2c9a2d3adf377765e405fcfa7d200ad3c8"} Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.209900 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.209966 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.210004 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.210697 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.210834 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.211944 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.212004 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.212028 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.212489 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:20 crc kubenswrapper[4901]: 
I0309 02:41:20.212554 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.212551 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.212646 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.212673 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.212583 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.213693 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.213740 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.213761 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:20 crc kubenswrapper[4901]: I0309 02:41:20.663972 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.219369 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99"} Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.219505 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a"} Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.219538 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82"} Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.219685 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.219823 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.222825 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.223287 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.223357 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.223403 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.224507 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.224670 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.224730 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:21 crc 
kubenswrapper[4901]: I0309 02:41:21.227743 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.227789 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:21 crc kubenswrapper[4901]: I0309 02:41:21.227809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.231619 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.231709 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5"} Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.232459 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb"} Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.231875 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.233876 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.233934 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.233956 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.234313 
4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.234430 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.234510 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.268129 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.499456 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.501878 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.501943 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.501967 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:22 crc kubenswrapper[4901]: I0309 02:41:22.502013 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:41:23 crc kubenswrapper[4901]: I0309 02:41:23.181662 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:23 crc kubenswrapper[4901]: I0309 02:41:23.234213 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:23 crc kubenswrapper[4901]: I0309 02:41:23.234380 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:23 crc kubenswrapper[4901]: I0309 
02:41:23.236635 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:23 crc kubenswrapper[4901]: I0309 02:41:23.236928 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:23 crc kubenswrapper[4901]: I0309 02:41:23.237088 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:23 crc kubenswrapper[4901]: I0309 02:41:23.237372 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:23 crc kubenswrapper[4901]: I0309 02:41:23.237440 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:23 crc kubenswrapper[4901]: I0309 02:41:23.237468 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:23 crc kubenswrapper[4901]: I0309 02:41:23.573529 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:24 crc kubenswrapper[4901]: I0309 02:41:24.237888 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:24 crc kubenswrapper[4901]: I0309 02:41:24.239677 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:24 crc kubenswrapper[4901]: I0309 02:41:24.239732 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:24 crc kubenswrapper[4901]: I0309 02:41:24.239743 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:24 crc kubenswrapper[4901]: I0309 02:41:24.539785 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:24 crc kubenswrapper[4901]: I0309 02:41:24.540150 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:24 crc kubenswrapper[4901]: I0309 02:41:24.542434 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:24 crc kubenswrapper[4901]: I0309 02:41:24.542505 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:24 crc kubenswrapper[4901]: I0309 02:41:24.542531 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:25 crc kubenswrapper[4901]: I0309 02:41:25.782840 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:25 crc kubenswrapper[4901]: I0309 02:41:25.783117 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:25 crc kubenswrapper[4901]: I0309 02:41:25.784981 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:25 crc kubenswrapper[4901]: I0309 02:41:25.785067 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:25 crc kubenswrapper[4901]: I0309 02:41:25.785093 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:25 crc kubenswrapper[4901]: I0309 02:41:25.790958 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:26 crc kubenswrapper[4901]: E0309 02:41:26.196268 4901 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to 
get node info: node \"crc\" not found" Mar 09 02:41:26 crc kubenswrapper[4901]: I0309 02:41:26.243301 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:26 crc kubenswrapper[4901]: I0309 02:41:26.244945 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:26 crc kubenswrapper[4901]: I0309 02:41:26.245153 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:26 crc kubenswrapper[4901]: I0309 02:41:26.245360 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:26 crc kubenswrapper[4901]: I0309 02:41:26.780332 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.234191 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.234620 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.236468 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.236532 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.236553 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.246367 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.248426 4901 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.248529 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.248559 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.253186 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.540581 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.540677 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.742062 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.742337 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.743886 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.743950 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:27 crc kubenswrapper[4901]: I0309 02:41:27.743970 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:28 crc kubenswrapper[4901]: I0309 02:41:28.250302 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:28 crc kubenswrapper[4901]: I0309 02:41:28.251912 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:28 crc kubenswrapper[4901]: I0309 02:41:28.252132 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:28 crc kubenswrapper[4901]: I0309 02:41:28.252360 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:29 crc kubenswrapper[4901]: W0309 02:41:29.691343 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 02:41:29 crc kubenswrapper[4901]: I0309 02:41:29.691451 4901 trace.go:236] Trace[1141915370]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 02:41:19.690) (total time: 10001ms): Mar 09 02:41:29 crc kubenswrapper[4901]: Trace[1141915370]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (02:41:29.691) Mar 09 02:41:29 crc kubenswrapper[4901]: Trace[1141915370]: [10.001352435s] [10.001352435s] END Mar 09 02:41:29 crc kubenswrapper[4901]: E0309 02:41:29.691476 4901 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 02:41:30 crc kubenswrapper[4901]: I0309 02:41:30.032503 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 09 02:41:30 crc kubenswrapper[4901]: E0309 02:41:30.465074 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b0c00d5f31a5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,LastTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:41:30 crc kubenswrapper[4901]: E0309 02:41:30.465426 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:30Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 09 02:41:30 crc kubenswrapper[4901]: E0309 
02:41:30.467669 4901 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:30 crc kubenswrapper[4901]: E0309 02:41:30.469741 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:30Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 02:41:30 crc kubenswrapper[4901]: W0309 02:41:30.470125 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:30Z is after 2026-02-23T05:33:13Z Mar 09 02:41:30 crc kubenswrapper[4901]: E0309 02:41:30.470212 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:30 crc kubenswrapper[4901]: W0309 02:41:30.473341 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:30Z is after 2026-02-23T05:33:13Z Mar 09 02:41:30 crc kubenswrapper[4901]: E0309 02:41:30.473496 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:30 crc kubenswrapper[4901]: W0309 02:41:30.482778 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:30Z is after 2026-02-23T05:33:13Z Mar 09 02:41:30 crc kubenswrapper[4901]: E0309 02:41:30.482950 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:30 crc kubenswrapper[4901]: I0309 02:41:30.487564 4901 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 02:41:30 crc kubenswrapper[4901]: I0309 02:41:30.487664 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 02:41:30 crc kubenswrapper[4901]: I0309 02:41:30.496835 4901 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]log ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]etcd ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/generic-apiserver-start-informers ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/priority-and-fairness-filter ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/start-apiextensions-informers ok Mar 09 02:41:30 crc 
kubenswrapper[4901]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 09 02:41:30 crc kubenswrapper[4901]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/start-system-namespaces-controller ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 09 02:41:30 crc kubenswrapper[4901]: [-]poststarthook/start-service-ip-repair-controllers failed: reason withheld Mar 09 02:41:30 crc kubenswrapper[4901]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 09 02:41:30 crc kubenswrapper[4901]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 09 02:41:30 crc kubenswrapper[4901]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Mar 09 02:41:30 crc kubenswrapper[4901]: [-]poststarthook/bootstrap-controller failed: reason withheld Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/start-kube-aggregator-informers ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 09 02:41:30 crc kubenswrapper[4901]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 09 02:41:30 crc kubenswrapper[4901]: [-]poststarthook/apiservice-discovery-controller 
failed: reason withheld Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]autoregister-completion ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/apiservice-openapi-controller ok Mar 09 02:41:30 crc kubenswrapper[4901]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 09 02:41:30 crc kubenswrapper[4901]: livez check failed Mar 09 02:41:30 crc kubenswrapper[4901]: I0309 02:41:30.496953 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 02:41:30 crc kubenswrapper[4901]: I0309 02:41:30.665077 4901 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 09 02:41:30 crc kubenswrapper[4901]: I0309 02:41:30.665170 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 09 02:41:31 crc kubenswrapper[4901]: I0309 02:41:31.036787 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:31Z is after 2026-02-23T05:33:13Z Mar 09 02:41:31 crc kubenswrapper[4901]: I0309 02:41:31.260464 4901 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 02:41:31 crc kubenswrapper[4901]: I0309 02:41:31.262921 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2d20dc59e6126e5e586d975f53336f2c9a2d3adf377765e405fcfa7d200ad3c8" exitCode=255 Mar 09 02:41:31 crc kubenswrapper[4901]: I0309 02:41:31.262963 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2d20dc59e6126e5e586d975f53336f2c9a2d3adf377765e405fcfa7d200ad3c8"} Mar 09 02:41:31 crc kubenswrapper[4901]: I0309 02:41:31.263206 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:31 crc kubenswrapper[4901]: I0309 02:41:31.264344 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:31 crc kubenswrapper[4901]: I0309 02:41:31.264393 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:31 crc kubenswrapper[4901]: I0309 02:41:31.264410 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:31 crc kubenswrapper[4901]: I0309 02:41:31.265107 4901 scope.go:117] "RemoveContainer" containerID="2d20dc59e6126e5e586d975f53336f2c9a2d3adf377765e405fcfa7d200ad3c8" Mar 09 02:41:32 crc kubenswrapper[4901]: I0309 02:41:32.036365 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:32Z is after 2026-02-23T05:33:13Z Mar 09 02:41:32 crc 
kubenswrapper[4901]: I0309 02:41:32.271490 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 02:41:32 crc kubenswrapper[4901]: I0309 02:41:32.275031 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"61ccbbff7ef5bf0c73946aae89b8507a233bbf9e2e30265a15fb649716182a07"} Mar 09 02:41:32 crc kubenswrapper[4901]: I0309 02:41:32.275171 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:32 crc kubenswrapper[4901]: I0309 02:41:32.276036 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:32 crc kubenswrapper[4901]: I0309 02:41:32.276122 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:32 crc kubenswrapper[4901]: I0309 02:41:32.276143 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:32 crc kubenswrapper[4901]: W0309 02:41:32.920919 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:32Z is after 2026-02-23T05:33:13Z Mar 09 02:41:32 crc kubenswrapper[4901]: E0309 02:41:32.921047 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-09T02:41:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.035779 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:33Z is after 2026-02-23T05:33:13Z Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.189559 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.283408 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.284194 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.287622 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="61ccbbff7ef5bf0c73946aae89b8507a233bbf9e2e30265a15fb649716182a07" exitCode=255 Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.287690 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"61ccbbff7ef5bf0c73946aae89b8507a233bbf9e2e30265a15fb649716182a07"} Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.287780 4901 scope.go:117] "RemoveContainer" containerID="2d20dc59e6126e5e586d975f53336f2c9a2d3adf377765e405fcfa7d200ad3c8" Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.287874 
4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.293132 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.293284 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.293325 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.295515 4901 scope.go:117] "RemoveContainer" containerID="61ccbbff7ef5bf0c73946aae89b8507a233bbf9e2e30265a15fb649716182a07" Mar 09 02:41:33 crc kubenswrapper[4901]: E0309 02:41:33.295863 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:41:33 crc kubenswrapper[4901]: I0309 02:41:33.300315 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:34 crc kubenswrapper[4901]: I0309 02:41:34.036658 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:34Z is after 2026-02-23T05:33:13Z Mar 09 02:41:34 crc kubenswrapper[4901]: I0309 02:41:34.294765 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 02:41:34 crc kubenswrapper[4901]: I0309 02:41:34.297890 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:34 crc kubenswrapper[4901]: I0309 02:41:34.299670 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:34 crc kubenswrapper[4901]: I0309 02:41:34.300171 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:34 crc kubenswrapper[4901]: I0309 02:41:34.300360 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:34 crc kubenswrapper[4901]: I0309 02:41:34.301351 4901 scope.go:117] "RemoveContainer" containerID="61ccbbff7ef5bf0c73946aae89b8507a233bbf9e2e30265a15fb649716182a07" Mar 09 02:41:34 crc kubenswrapper[4901]: E0309 02:41:34.301971 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:41:35 crc kubenswrapper[4901]: I0309 02:41:35.035391 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:35Z is after 2026-02-23T05:33:13Z Mar 09 02:41:35 crc kubenswrapper[4901]: I0309 02:41:35.300850 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 09 02:41:35 crc kubenswrapper[4901]: I0309 02:41:35.302299 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:35 crc kubenswrapper[4901]: I0309 02:41:35.302344 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:35 crc kubenswrapper[4901]: I0309 02:41:35.302362 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:35 crc kubenswrapper[4901]: I0309 02:41:35.303204 4901 scope.go:117] "RemoveContainer" containerID="61ccbbff7ef5bf0c73946aae89b8507a233bbf9e2e30265a15fb649716182a07" Mar 09 02:41:35 crc kubenswrapper[4901]: E0309 02:41:35.303540 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.036163 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:36Z is after 2026-02-23T05:33:13Z Mar 09 02:41:36 crc kubenswrapper[4901]: E0309 02:41:36.196421 4901 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.775866 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:36 crc 
kubenswrapper[4901]: I0309 02:41:36.776140 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.778145 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.778211 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.778275 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.779085 4901 scope.go:117] "RemoveContainer" containerID="61ccbbff7ef5bf0c73946aae89b8507a233bbf9e2e30265a15fb649716182a07" Mar 09 02:41:36 crc kubenswrapper[4901]: E0309 02:41:36.779507 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.869882 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:36 crc kubenswrapper[4901]: E0309 02:41:36.870873 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:36Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.871432 4901 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.871504 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.871527 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:36 crc kubenswrapper[4901]: I0309 02:41:36.871572 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:41:36 crc kubenswrapper[4901]: E0309 02:41:36.876937 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:36Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 02:41:37 crc kubenswrapper[4901]: I0309 02:41:37.036449 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:37Z is after 2026-02-23T05:33:13Z Mar 09 02:41:37 crc kubenswrapper[4901]: I0309 02:41:37.540458 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 02:41:37 crc kubenswrapper[4901]: I0309 02:41:37.540535 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" 
probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 02:41:37 crc kubenswrapper[4901]: W0309 02:41:37.745335 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:37Z is after 2026-02-23T05:33:13Z Mar 09 02:41:37 crc kubenswrapper[4901]: E0309 02:41:37.745449 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:37 crc kubenswrapper[4901]: I0309 02:41:37.779863 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 09 02:41:37 crc kubenswrapper[4901]: I0309 02:41:37.780089 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:37 crc kubenswrapper[4901]: I0309 02:41:37.781539 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:37 crc kubenswrapper[4901]: I0309 02:41:37.781723 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:37 crc kubenswrapper[4901]: I0309 02:41:37.781881 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:37 crc 
kubenswrapper[4901]: I0309 02:41:37.799399 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 09 02:41:38 crc kubenswrapper[4901]: I0309 02:41:38.036985 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:38Z is after 2026-02-23T05:33:13Z Mar 09 02:41:38 crc kubenswrapper[4901]: W0309 02:41:38.254107 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:38Z is after 2026-02-23T05:33:13Z Mar 09 02:41:38 crc kubenswrapper[4901]: E0309 02:41:38.254204 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:38 crc kubenswrapper[4901]: I0309 02:41:38.310126 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:38 crc kubenswrapper[4901]: I0309 02:41:38.311949 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:38 crc kubenswrapper[4901]: I0309 02:41:38.311999 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:38 crc 
kubenswrapper[4901]: I0309 02:41:38.312017 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:39 crc kubenswrapper[4901]: I0309 02:41:39.036161 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:39Z is after 2026-02-23T05:33:13Z Mar 09 02:41:39 crc kubenswrapper[4901]: I0309 02:41:39.069762 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 02:41:39 crc kubenswrapper[4901]: E0309 02:41:39.075214 4901 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:40 crc kubenswrapper[4901]: I0309 02:41:40.036559 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:40Z is after 2026-02-23T05:33:13Z Mar 09 02:41:40 crc kubenswrapper[4901]: E0309 02:41:40.471020 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:40Z is after 
2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b0c00d5f31a5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,LastTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:41:40 crc kubenswrapper[4901]: I0309 02:41:40.664210 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:40 crc kubenswrapper[4901]: I0309 02:41:40.664847 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:40 crc kubenswrapper[4901]: I0309 02:41:40.666859 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:40 crc kubenswrapper[4901]: I0309 02:41:40.666932 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:40 crc kubenswrapper[4901]: I0309 02:41:40.666956 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:40 crc kubenswrapper[4901]: I0309 02:41:40.668130 4901 scope.go:117] "RemoveContainer" containerID="61ccbbff7ef5bf0c73946aae89b8507a233bbf9e2e30265a15fb649716182a07" Mar 09 02:41:40 crc kubenswrapper[4901]: E0309 02:41:40.668508 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:41:40 crc kubenswrapper[4901]: W0309 02:41:40.700389 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:40Z is after 2026-02-23T05:33:13Z Mar 09 02:41:40 crc kubenswrapper[4901]: E0309 02:41:40.700520 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:41 crc kubenswrapper[4901]: I0309 02:41:41.036395 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:41Z is after 2026-02-23T05:33:13Z Mar 09 02:41:42 crc kubenswrapper[4901]: I0309 02:41:42.037698 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:42Z is after 2026-02-23T05:33:13Z Mar 09 02:41:42 crc kubenswrapper[4901]: W0309 02:41:42.916827 4901 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:42Z is after 2026-02-23T05:33:13Z Mar 09 02:41:42 crc kubenswrapper[4901]: E0309 02:41:42.916972 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:43 crc kubenswrapper[4901]: I0309 02:41:43.037349 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:43Z is after 2026-02-23T05:33:13Z Mar 09 02:41:43 crc kubenswrapper[4901]: E0309 02:41:43.876779 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:43Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 02:41:43 crc kubenswrapper[4901]: I0309 02:41:43.877973 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:43 crc kubenswrapper[4901]: I0309 02:41:43.879612 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:43 
crc kubenswrapper[4901]: I0309 02:41:43.879673 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:43 crc kubenswrapper[4901]: I0309 02:41:43.879701 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:43 crc kubenswrapper[4901]: I0309 02:41:43.879749 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:41:43 crc kubenswrapper[4901]: E0309 02:41:43.884668 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:43Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 02:41:44 crc kubenswrapper[4901]: I0309 02:41:44.036727 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:44Z is after 2026-02-23T05:33:13Z Mar 09 02:41:45 crc kubenswrapper[4901]: I0309 02:41:45.036266 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:45Z is after 2026-02-23T05:33:13Z Mar 09 02:41:46 crc kubenswrapper[4901]: I0309 02:41:46.035946 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:46Z is after 
2026-02-23T05:33:13Z Mar 09 02:41:46 crc kubenswrapper[4901]: E0309 02:41:46.196819 4901 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 02:41:47 crc kubenswrapper[4901]: I0309 02:41:47.036474 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:47Z is after 2026-02-23T05:33:13Z Mar 09 02:41:47 crc kubenswrapper[4901]: I0309 02:41:47.541162 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 02:41:47 crc kubenswrapper[4901]: I0309 02:41:47.541263 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 02:41:47 crc kubenswrapper[4901]: I0309 02:41:47.541338 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:47 crc kubenswrapper[4901]: I0309 02:41:47.541520 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:47 crc kubenswrapper[4901]: I0309 02:41:47.542773 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 02:41:47 crc kubenswrapper[4901]: I0309 02:41:47.542809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:47 crc kubenswrapper[4901]: I0309 02:41:47.542820 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:47 crc kubenswrapper[4901]: I0309 02:41:47.543412 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"a9c9580eb124dab4af78808f51714f0bc0d74cda42b20a800c3f85dccafb84ec"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 09 02:41:47 crc kubenswrapper[4901]: I0309 02:41:47.543636 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://a9c9580eb124dab4af78808f51714f0bc0d74cda42b20a800c3f85dccafb84ec" gracePeriod=30 Mar 09 02:41:48 crc kubenswrapper[4901]: I0309 02:41:48.037195 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:48Z is after 2026-02-23T05:33:13Z Mar 09 02:41:48 crc kubenswrapper[4901]: I0309 02:41:48.344343 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 02:41:48 crc kubenswrapper[4901]: I0309 02:41:48.345109 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="f614b9022728cf315e60c057852e563e" containerID="a9c9580eb124dab4af78808f51714f0bc0d74cda42b20a800c3f85dccafb84ec" exitCode=255 Mar 09 02:41:48 crc kubenswrapper[4901]: I0309 02:41:48.345189 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a9c9580eb124dab4af78808f51714f0bc0d74cda42b20a800c3f85dccafb84ec"} Mar 09 02:41:48 crc kubenswrapper[4901]: I0309 02:41:48.345272 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c"} Mar 09 02:41:48 crc kubenswrapper[4901]: I0309 02:41:48.345451 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:48 crc kubenswrapper[4901]: I0309 02:41:48.347028 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:48 crc kubenswrapper[4901]: I0309 02:41:48.347073 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:48 crc kubenswrapper[4901]: I0309 02:41:48.347095 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:49 crc kubenswrapper[4901]: I0309 02:41:49.036751 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:49Z is after 2026-02-23T05:33:13Z Mar 09 02:41:50 crc kubenswrapper[4901]: I0309 02:41:50.037429 4901 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:50Z is after 2026-02-23T05:33:13Z Mar 09 02:41:50 crc kubenswrapper[4901]: E0309 02:41:50.477613 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b0c00d5f31a5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,LastTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:41:50 crc kubenswrapper[4901]: E0309 02:41:50.881859 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:50Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 02:41:50 crc kubenswrapper[4901]: I0309 02:41:50.884975 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:50 crc kubenswrapper[4901]: I0309 02:41:50.887265 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 
02:41:50 crc kubenswrapper[4901]: I0309 02:41:50.887339 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:50 crc kubenswrapper[4901]: I0309 02:41:50.887360 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:50 crc kubenswrapper[4901]: I0309 02:41:50.887410 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:41:50 crc kubenswrapper[4901]: E0309 02:41:50.892363 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:50Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 02:41:51 crc kubenswrapper[4901]: I0309 02:41:51.036291 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:51Z is after 2026-02-23T05:33:13Z Mar 09 02:41:52 crc kubenswrapper[4901]: I0309 02:41:52.037261 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:52Z is after 2026-02-23T05:33:13Z Mar 09 02:41:53 crc kubenswrapper[4901]: I0309 02:41:53.040635 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:53Z is 
after 2026-02-23T05:33:13Z Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.035986 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:54Z is after 2026-02-23T05:33:13Z Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.105911 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.107904 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.108001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.108021 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.108898 4901 scope.go:117] "RemoveContainer" containerID="61ccbbff7ef5bf0c73946aae89b8507a233bbf9e2e30265a15fb649716182a07" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.370608 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.373870 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7e3fac986a6132fe6180180b96999fcb0b1e6be73db98c01464aae6f5bf83630"} Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.374113 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.376563 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.376631 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.376656 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.540800 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.540998 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.543098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.543205 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:54 crc kubenswrapper[4901]: I0309 02:41:54.543276 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.034954 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:55Z is after 2026-02-23T05:33:13Z Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.379775 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.380789 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.383591 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7e3fac986a6132fe6180180b96999fcb0b1e6be73db98c01464aae6f5bf83630" exitCode=255 Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.383645 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7e3fac986a6132fe6180180b96999fcb0b1e6be73db98c01464aae6f5bf83630"} Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.383706 4901 scope.go:117] "RemoveContainer" containerID="61ccbbff7ef5bf0c73946aae89b8507a233bbf9e2e30265a15fb649716182a07" Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.383935 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.385461 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.385507 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.385527 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:55 crc kubenswrapper[4901]: I0309 02:41:55.386420 4901 scope.go:117] "RemoveContainer" containerID="7e3fac986a6132fe6180180b96999fcb0b1e6be73db98c01464aae6f5bf83630" Mar 09 02:41:55 
crc kubenswrapper[4901]: E0309 02:41:55.386704 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.034623 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:56Z is after 2026-02-23T05:33:13Z Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.117091 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 02:41:56 crc kubenswrapper[4901]: E0309 02:41:56.121150 4901 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:56 crc kubenswrapper[4901]: E0309 02:41:56.122386 4901 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 09 02:41:56 crc kubenswrapper[4901]: E0309 02:41:56.196976 4901 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get 
node info: node \"crc\" not found" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.389486 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.776121 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.776497 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.778526 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.778620 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.778646 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.779857 4901 scope.go:117] "RemoveContainer" containerID="7e3fac986a6132fe6180180b96999fcb0b1e6be73db98c01464aae6f5bf83630" Mar 09 02:41:56 crc kubenswrapper[4901]: E0309 02:41:56.780276 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.780374 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 
02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.780615 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.782329 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.782397 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:56 crc kubenswrapper[4901]: I0309 02:41:56.782422 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:57 crc kubenswrapper[4901]: I0309 02:41:57.036968 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:57Z is after 2026-02-23T05:33:13Z Mar 09 02:41:57 crc kubenswrapper[4901]: I0309 02:41:57.541682 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 02:41:57 crc kubenswrapper[4901]: I0309 02:41:57.541800 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 02:41:57 crc kubenswrapper[4901]: W0309 
02:41:57.768506 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:57Z is after 2026-02-23T05:33:13Z Mar 09 02:41:57 crc kubenswrapper[4901]: E0309 02:41:57.768607 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:57 crc kubenswrapper[4901]: E0309 02:41:57.888806 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:57Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 02:41:57 crc kubenswrapper[4901]: I0309 02:41:57.893097 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:41:57 crc kubenswrapper[4901]: I0309 02:41:57.895408 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:41:57 crc kubenswrapper[4901]: I0309 02:41:57.895464 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:41:57 crc kubenswrapper[4901]: I0309 02:41:57.895487 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:41:57 crc 
kubenswrapper[4901]: I0309 02:41:57.895532 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:41:57 crc kubenswrapper[4901]: E0309 02:41:57.901520 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:57Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 02:41:58 crc kubenswrapper[4901]: I0309 02:41:58.038618 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:58Z is after 2026-02-23T05:33:13Z Mar 09 02:41:58 crc kubenswrapper[4901]: W0309 02:41:58.573654 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:58Z is after 2026-02-23T05:33:13Z Mar 09 02:41:58 crc kubenswrapper[4901]: E0309 02:41:58.573782 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:59 crc kubenswrapper[4901]: W0309 02:41:59.030339 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:59Z is after 2026-02-23T05:33:13Z Mar 09 02:41:59 crc kubenswrapper[4901]: E0309 02:41:59.030458 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:41:59 crc kubenswrapper[4901]: I0309 02:41:59.036568 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:41:59Z is after 2026-02-23T05:33:13Z Mar 09 02:42:00 crc kubenswrapper[4901]: I0309 02:42:00.036649 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:00Z is after 2026-02-23T05:33:13Z Mar 09 02:42:00 crc kubenswrapper[4901]: E0309 02:42:00.483537 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:00Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b0c00d5f31a5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,LastTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:00 crc kubenswrapper[4901]: W0309 02:42:00.559604 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:00Z is after 2026-02-23T05:33:13Z Mar 09 02:42:00 crc kubenswrapper[4901]: E0309 02:42:00.559754 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 02:42:00 crc kubenswrapper[4901]: I0309 02:42:00.664725 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:42:00 crc kubenswrapper[4901]: I0309 02:42:00.665055 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:00 crc kubenswrapper[4901]: I0309 02:42:00.666993 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:00 crc kubenswrapper[4901]: I0309 
02:42:00.667082 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 02:42:00 crc kubenswrapper[4901]: I0309 02:42:00.667111 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 02:42:00 crc kubenswrapper[4901]: I0309 02:42:00.668405 4901 scope.go:117] "RemoveContainer" containerID="7e3fac986a6132fe6180180b96999fcb0b1e6be73db98c01464aae6f5bf83630"
Mar 09 02:42:00 crc kubenswrapper[4901]: E0309 02:42:00.668804 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 02:42:01 crc kubenswrapper[4901]: I0309 02:42:01.036706 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:01Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:02 crc kubenswrapper[4901]: I0309 02:42:02.036919 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:02Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:03 crc kubenswrapper[4901]: I0309 02:42:03.035587 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:03Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:04 crc kubenswrapper[4901]: I0309 02:42:04.035669 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:04Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:04 crc kubenswrapper[4901]: E0309 02:42:04.895379 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:04Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 09 02:42:04 crc kubenswrapper[4901]: I0309 02:42:04.902480 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 02:42:04 crc kubenswrapper[4901]: I0309 02:42:04.908766 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 02:42:04 crc kubenswrapper[4901]: I0309 02:42:04.909522 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 02:42:04 crc kubenswrapper[4901]: I0309 02:42:04.909703 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 02:42:04 crc kubenswrapper[4901]: I0309 02:42:04.909878 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 02:42:04 crc kubenswrapper[4901]: E0309 02:42:04.915524 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:04Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 02:42:05 crc kubenswrapper[4901]: I0309 02:42:05.037113 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:05Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:06 crc kubenswrapper[4901]: I0309 02:42:06.036522 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:06Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:06 crc kubenswrapper[4901]: E0309 02:42:06.197148 4901 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 02:42:07 crc kubenswrapper[4901]: I0309 02:42:07.036693 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:07Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:07 crc kubenswrapper[4901]: I0309 02:42:07.540602 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 02:42:07 crc kubenswrapper[4901]: I0309 02:42:07.541353 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 02:42:07 crc kubenswrapper[4901]: I0309 02:42:07.795182 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 02:42:07 crc kubenswrapper[4901]: I0309 02:42:07.795479 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 02:42:07 crc kubenswrapper[4901]: I0309 02:42:07.796945 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 02:42:07 crc kubenswrapper[4901]: I0309 02:42:07.796985 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 02:42:07 crc kubenswrapper[4901]: I0309 02:42:07.797003 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 02:42:08 crc kubenswrapper[4901]: I0309 02:42:08.039583 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:08Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:09 crc kubenswrapper[4901]: I0309 02:42:09.036957 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:09Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:10 crc kubenswrapper[4901]: I0309 02:42:10.037472 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:10Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:10 crc kubenswrapper[4901]: E0309 02:42:10.489266 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:10Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b0c00d5f31a5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,LastTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:11 crc kubenswrapper[4901]: I0309 02:42:11.034636 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:11Z is after 2026-02-23T05:33:13Z
Mar 09 02:42:11 crc kubenswrapper[4901]: E0309 02:42:11.903303 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 09 02:42:11 crc kubenswrapper[4901]: I0309 02:42:11.916509 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 02:42:11 crc kubenswrapper[4901]: I0309 02:42:11.918433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 02:42:11 crc kubenswrapper[4901]: I0309 02:42:11.918493 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 02:42:11 crc kubenswrapper[4901]: I0309 02:42:11.918513 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 02:42:11 crc kubenswrapper[4901]: I0309 02:42:11.918555 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 02:42:11 crc kubenswrapper[4901]: E0309 02:42:11.925617 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 09 02:42:12 crc kubenswrapper[4901]: I0309 02:42:12.040771 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 02:42:13 crc kubenswrapper[4901]: I0309 02:42:13.041067 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 02:42:13 crc kubenswrapper[4901]: I0309 02:42:13.106090 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 02:42:13 crc kubenswrapper[4901]: I0309 02:42:13.107906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 02:42:13 crc kubenswrapper[4901]: I0309 02:42:13.108111 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 02:42:13 crc kubenswrapper[4901]: I0309 02:42:13.108280 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 02:42:13 crc kubenswrapper[4901]: I0309 02:42:13.109369 4901 scope.go:117] "RemoveContainer" containerID="7e3fac986a6132fe6180180b96999fcb0b1e6be73db98c01464aae6f5bf83630"
Mar 09 02:42:13 crc kubenswrapper[4901]: E0309 02:42:13.109792 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 02:42:14 crc kubenswrapper[4901]: I0309 02:42:14.037638 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 02:42:15 crc kubenswrapper[4901]: I0309 02:42:15.038558 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 02:42:16 crc kubenswrapper[4901]: I0309 02:42:16.038469 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 02:42:16 crc kubenswrapper[4901]: E0309 02:42:16.197441 4901 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 02:42:17 crc kubenswrapper[4901]: I0309 02:42:17.038462 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 02:42:17 crc kubenswrapper[4901]: I0309 02:42:17.540915 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 02:42:17 crc kubenswrapper[4901]: I0309 02:42:17.540985 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 02:42:17 crc kubenswrapper[4901]: I0309 02:42:17.541049 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 02:42:17 crc kubenswrapper[4901]: I0309 02:42:17.541190 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 02:42:17 crc kubenswrapper[4901]: I0309 02:42:17.542579 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 02:42:17 crc kubenswrapper[4901]: I0309 02:42:17.542611 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 02:42:17 crc kubenswrapper[4901]: I0309 02:42:17.542650 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 02:42:17 crc kubenswrapper[4901]: I0309 02:42:17.543077 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 09 02:42:17 crc kubenswrapper[4901]: I0309 02:42:17.543176 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c" gracePeriod=30
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.040254 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.459974 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.462038 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.462703 4901 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c" exitCode=255
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.462841 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c"}
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.463175 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3"}
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.463266 4901 scope.go:117] "RemoveContainer" containerID="a9c9580eb124dab4af78808f51714f0bc0d74cda42b20a800c3f85dccafb84ec"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.463391 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.465164 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.465215 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.465261 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 02:42:18 crc kubenswrapper[4901]: E0309 02:42:18.910918 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.926919 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.928847 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.928921 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.928946 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 02:42:18 crc kubenswrapper[4901]: I0309 02:42:18.929015 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 02:42:18 crc kubenswrapper[4901]: E0309 02:42:18.931072 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 09 02:42:19 crc kubenswrapper[4901]: I0309 02:42:19.039819 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 02:42:19 crc kubenswrapper[4901]: I0309 02:42:19.478386 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 09 02:42:19 crc kubenswrapper[4901]: I0309 02:42:19.480718 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 02:42:19 crc kubenswrapper[4901]: I0309 02:42:19.482061 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 02:42:19 crc kubenswrapper[4901]: I0309 02:42:19.482172 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 02:42:19 crc kubenswrapper[4901]: I0309 02:42:19.482192 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 02:42:20 crc kubenswrapper[4901]: I0309 02:42:20.038735 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.496817 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00d5f31a5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,LastTimestamp:2026-03-09 02:41:16.02943446 +0000 UTC m=+0.619098222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.503058 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6ab4d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.10438165 +0000 UTC m=+0.694045432,LastTimestamp:2026-03-09 02:41:16.10438165 +0000 UTC m=+0.694045432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.509911 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6b5ad7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104424151 +0000 UTC m=+0.694087913,LastTimestamp:2026-03-09 02:41:16.104424151 +0000 UTC m=+0.694087913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.516804 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6bb69a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104447642 +0000 UTC m=+0.694111414,LastTimestamp:2026-03-09 02:41:16.104447642 +0000 UTC m=+0.694111414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.522752 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00df46e097 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.185919639 +0000 UTC m=+0.775583381,LastTimestamp:2026-03-09 02:41:16.185919639 +0000 UTC m=+0.775583381,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.529812 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6ab4d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6ab4d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.10438165 +0000 UTC m=+0.694045432,LastTimestamp:2026-03-09 02:41:16.207347079 +0000 UTC m=+0.797010821,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.536755 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6b5ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6b5ad7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104424151 +0000 UTC m=+0.694087913,LastTimestamp:2026-03-09 02:41:16.20736102 +0000 UTC m=+0.797024762,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.538948 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6bb69a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6bb69a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104447642 +0000 UTC m=+0.694111414,LastTimestamp:2026-03-09 02:41:16.207373941 +0000 UTC m=+0.797037683,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.545545 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6ab4d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6ab4d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.10438165 +0000 UTC m=+0.694045432,LastTimestamp:2026-03-09 02:41:16.212091491 +0000 UTC m=+0.801755233,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.554004 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6b5ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6b5ad7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104424151 +0000 UTC m=+0.694087913,LastTimestamp:2026-03-09 02:41:16.212109212 +0000 UTC m=+0.801772954,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.561840 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6bb69a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6bb69a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104447642 +0000 UTC m=+0.694111414,LastTimestamp:2026-03-09 02:41:16.212119872 +0000 UTC m=+0.801783614,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.570103 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6ab4d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6ab4d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.10438165 +0000 UTC m=+0.694045432,LastTimestamp:2026-03-09 02:41:16.212336224 +0000 UTC m=+0.801999986,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.577211 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6b5ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6b5ad7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104424151 +0000 UTC m=+0.694087913,LastTimestamp:2026-03-09 02:41:16.212370836 +0000 UTC m=+0.802034598,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.584258 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6bb69a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6bb69a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104447642 +0000 UTC m=+0.694111414,LastTimestamp:2026-03-09 02:41:16.212388937 +0000 UTC m=+0.802052709,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.590858 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6ab4d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6ab4d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.10438165 +0000 UTC m=+0.694045432,LastTimestamp:2026-03-09 02:41:16.214364746 +0000 UTC m=+0.804028488,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.597892 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6b5ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6b5ad7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104424151 +0000 UTC m=+0.694087913,LastTimestamp:2026-03-09 02:41:16.214483582 +0000 UTC m=+0.804147324,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.604760 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6bb69a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6bb69a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104447642 +0000 UTC m=+0.694111414,LastTimestamp:2026-03-09 02:41:16.214569257 +0000 UTC m=+0.804232999,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.612132 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6ab4d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6ab4d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.10438165 +0000 UTC m=+0.694045432,LastTimestamp:2026-03-09 02:41:16.214645211 +0000 UTC m=+0.804308973,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.621069 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6b5ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6b5ad7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104424151 +0000 UTC m=+0.694087913,LastTimestamp:2026-03-09 02:41:16.214718195 +0000 UTC m=+0.804381957,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.627496 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6bb69a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6bb69a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104447642 +0000 UTC m=+0.694111414,LastTimestamp:2026-03-09 02:41:16.214753497 +0000 UTC m=+0.804417259,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.633792 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6ab4d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6ab4d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.10438165 +0000 UTC
m=+0.694045432,LastTimestamp:2026-03-09 02:41:16.216970159 +0000 UTC m=+0.806633931,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.640647 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6b5ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6b5ad7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104424151 +0000 UTC m=+0.694087913,LastTimestamp:2026-03-09 02:41:16.216990631 +0000 UTC m=+0.806654393,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.645479 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6bb69a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6bb69a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104447642 +0000 UTC m=+0.694111414,LastTimestamp:2026-03-09 02:41:16.217006231 +0000 UTC m=+0.806670003,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.648041 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6ab4d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6ab4d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.10438165 +0000 UTC m=+0.694045432,LastTimestamp:2026-03-09 02:41:16.217359541 +0000 UTC m=+0.807023293,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.651719 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b0c00da6b5ad7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b0c00da6b5ad7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.104424151 +0000 UTC m=+0.694087913,LastTimestamp:2026-03-09 02:41:16.217493658 +0000 UTC m=+0.807157420,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.654758 4901 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b0c00f8f43f65 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.616712037 +0000 UTC m=+1.206375799,LastTimestamp:2026-03-09 02:41:16.616712037 +0000 UTC m=+1.206375799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.662568 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c00f8f57b82 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.616792962 +0000 UTC m=+1.206456734,LastTimestamp:2026-03-09 02:41:16.616792962 +0000 UTC m=+1.206456734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.667433 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c00f8f5c2e1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.616811233 +0000 UTC m=+1.206474995,LastTimestamp:2026-03-09 02:41:16.616811233 +0000 UTC m=+1.206474995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.672756 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c00f94a5d1a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.622355738 +0000 UTC m=+1.212019500,LastTimestamp:2026-03-09 02:41:16.622355738 +0000 UTC m=+1.212019500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.676746 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c00f971a5e8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:16.62493028 +0000 UTC m=+1.214594052,LastTimestamp:2026-03-09 02:41:16.62493028 +0000 UTC m=+1.214594052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.681158 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c01206e214a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.279011146 +0000 UTC m=+1.868674918,LastTimestamp:2026-03-09 02:41:17.279011146 +0000 UTC m=+1.868674918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.685718 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c01207150eb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.279219947 +0000 UTC m=+1.868883689,LastTimestamp:2026-03-09 02:41:17.279219947 +0000 UTC m=+1.868883689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 
02:42:20.690374 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c012198ad35 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.298576693 +0000 UTC m=+1.888240435,LastTimestamp:2026-03-09 02:41:17.298576693 +0000 UTC m=+1.888240435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.697280 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b0c0121cf75f7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.302167031 +0000 UTC m=+1.891830773,LastTimestamp:2026-03-09 02:41:17.302167031 +0000 UTC m=+1.891830773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 
02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.703095 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0121cffb69 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.302201193 +0000 UTC m=+1.891864945,LastTimestamp:2026-03-09 02:41:17.302201193 +0000 UTC m=+1.891864945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.707683 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c0121d10931 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.302270257 +0000 UTC m=+1.891934009,LastTimestamp:2026-03-09 02:41:17.302270257 +0000 UTC 
m=+1.891934009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.713052 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c0121d76aa4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.30268842 +0000 UTC m=+1.892352202,LastTimestamp:2026-03-09 02:41:17.30268842 +0000 UTC m=+1.892352202,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.717892 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0121f3bada openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.304543962 +0000 UTC m=+1.894207704,LastTimestamp:2026-03-09 02:41:17.304543962 +0000 UTC m=+1.894207704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.722040 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c0122fa9489 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.321770121 +0000 UTC m=+1.911433863,LastTimestamp:2026-03-09 02:41:17.321770121 +0000 UTC m=+1.911433863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.725644 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c012307fb8b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.322648459 +0000 UTC m=+1.912312201,LastTimestamp:2026-03-09 02:41:17.322648459 +0000 UTC m=+1.912312201,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.729935 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b0c012310edde openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.323234782 +0000 UTC m=+1.912898524,LastTimestamp:2026-03-09 02:41:17.323234782 +0000 UTC m=+1.912898524,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.735394 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0136375372 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.644518258 +0000 UTC m=+2.234182020,LastTimestamp:2026-03-09 02:41:17.644518258 +0000 UTC m=+2.234182020,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.741545 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c01374dbcfb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.662764283 +0000 UTC m=+2.252428015,LastTimestamp:2026-03-09 02:41:17.662764283 +0000 UTC m=+2.252428015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.745938 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c01375f1544 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.663900996 +0000 UTC m=+2.253564728,LastTimestamp:2026-03-09 02:41:17.663900996 +0000 UTC m=+2.253564728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.749915 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c014598dbc0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.902568384 +0000 UTC m=+2.492232156,LastTimestamp:2026-03-09 02:41:17.902568384 +0000 UTC m=+2.492232156,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.753371 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0146936508 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.918987528 +0000 UTC m=+2.508651300,LastTimestamp:2026-03-09 02:41:17.918987528 +0000 UTC m=+2.508651300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.756672 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0146af531d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.920817949 +0000 UTC m=+2.510481711,LastTimestamp:2026-03-09 02:41:17.920817949 +0000 UTC m=+2.510481711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.760704 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c01536c7a98 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.134540952 +0000 UTC m=+2.724204684,LastTimestamp:2026-03-09 02:41:18.134540952 +0000 UTC m=+2.724204684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.764564 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b0c01536c9550 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.134547792 +0000 UTC m=+2.724211534,LastTimestamp:2026-03-09 02:41:18.134547792 +0000 UTC m=+2.724211534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.769216 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c015457eabf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.149970623 +0000 UTC m=+2.739634365,LastTimestamp:2026-03-09 02:41:18.149970623 +0000 UTC m=+2.739634365,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.774041 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c0154854157 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.152941911 +0000 UTC m=+2.742605683,LastTimestamp:2026-03-09 02:41:18.152941911 +0000 UTC m=+2.742605683,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.778713 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c015701768a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container 
kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.194636426 +0000 UTC m=+2.784300158,LastTimestamp:2026-03-09 02:41:18.194636426 +0000 UTC m=+2.784300158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.784605 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c01583d3d67 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.215331175 +0000 UTC m=+2.804994917,LastTimestamp:2026-03-09 02:41:18.215331175 +0000 UTC m=+2.804994917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.789910 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c0161934f0e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.371966734 +0000 UTC m=+2.961630466,LastTimestamp:2026-03-09 02:41:18.371966734 +0000 UTC m=+2.961630466,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.795107 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b0c0161fb42d8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.378779352 +0000 UTC m=+2.968443084,LastTimestamp:2026-03-09 02:41:18.378779352 +0000 UTC m=+2.968443084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.800614 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c0162bd1c03 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.391483395 +0000 UTC m=+2.981147127,LastTimestamp:2026-03-09 02:41:18.391483395 +0000 UTC m=+2.981147127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.807346 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c0162ca5f67 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.392352615 +0000 UTC m=+2.982016347,LastTimestamp:2026-03-09 02:41:18.392352615 +0000 UTC m=+2.982016347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.811764 
4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c0162e14c7c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.3938551 +0000 UTC m=+2.983518832,LastTimestamp:2026-03-09 02:41:18.3938551 +0000 UTC m=+2.983518832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.818077 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b0c0162e83870 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.39430872 +0000 UTC m=+2.983972452,LastTimestamp:2026-03-09 02:41:18.39430872 +0000 UTC m=+2.983972452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.824146 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c0162ee0669 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.394689129 +0000 UTC m=+2.984352861,LastTimestamp:2026-03-09 02:41:18.394689129 +0000 UTC m=+2.984352861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.829017 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c016427e6d0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.415259344 +0000 UTC m=+3.004923076,LastTimestamp:2026-03-09 02:41:18.415259344 +0000 UTC m=+3.004923076,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.836097 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c0164301213 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.415794707 +0000 UTC m=+3.005458439,LastTimestamp:2026-03-09 02:41:18.415794707 +0000 UTC m=+3.005458439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.842146 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c01643decc2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.416702658 +0000 UTC m=+3.006366390,LastTimestamp:2026-03-09 02:41:18.416702658 +0000 UTC m=+3.006366390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.846445 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c017085d6cb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.622742219 +0000 UTC m=+3.212405961,LastTimestamp:2026-03-09 02:41:18.622742219 +0000 UTC m=+3.212405961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.850536 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c0171e7283a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.645897274 +0000 UTC m=+3.235561006,LastTimestamp:2026-03-09 02:41:18.645897274 +0000 UTC m=+3.235561006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.854420 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c0171fa684f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.647158863 +0000 UTC m=+3.236822595,LastTimestamp:2026-03-09 02:41:18.647158863 +0000 UTC m=+3.236822595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.857960 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c0171fd86ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.647363278 +0000 UTC m=+3.237027010,LastTimestamp:2026-03-09 02:41:18.647363278 +0000 UTC m=+3.237027010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.863777 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c01734c09f2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.669285874 +0000 UTC m=+3.258949606,LastTimestamp:2026-03-09 02:41:18.669285874 +0000 UTC m=+3.258949606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc 
kubenswrapper[4901]: E0309 02:42:20.870869 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c0173a03f34 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.674804532 +0000 UTC m=+3.264468264,LastTimestamp:2026-03-09 02:41:18.674804532 +0000 UTC m=+3.264468264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.875407 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c01801c799e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.884272542 
+0000 UTC m=+3.473936284,LastTimestamp:2026-03-09 02:41:18.884272542 +0000 UTC m=+3.473936284,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.879880 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c01809958f6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.892456182 +0000 UTC m=+3.482119914,LastTimestamp:2026-03-09 02:41:18.892456182 +0000 UTC m=+3.482119914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.885789 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c0180d91fa0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started 
container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.896635808 +0000 UTC m=+3.486299540,LastTimestamp:2026-03-09 02:41:18.896635808 +0000 UTC m=+3.486299540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.890659 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c0180e8a7b2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.897653682 +0000 UTC m=+3.487317414,LastTimestamp:2026-03-09 02:41:18.897653682 +0000 UTC m=+3.487317414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.896013 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b0c01813d725e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:18.90321059 +0000 UTC m=+3.492874322,LastTimestamp:2026-03-09 02:41:18.90321059 +0000 UTC m=+3.492874322,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.899875 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c018bcfda35 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:19.080577589 +0000 UTC m=+3.670241321,LastTimestamp:2026-03-09 02:41:19.080577589 +0000 UTC m=+3.670241321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.905788 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c018cacdac9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:19.095061193 +0000 UTC m=+3.684724975,LastTimestamp:2026-03-09 02:41:19.095061193 +0000 UTC m=+3.684724975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.911711 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c018cc6be3e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:19.096757822 +0000 UTC m=+3.686421554,LastTimestamp:2026-03-09 02:41:19.096757822 +0000 UTC m=+3.686421554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 
02:42:20.917115 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c0191b3fa75 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:19.179414133 +0000 UTC m=+3.769077875,LastTimestamp:2026-03-09 02:41:19.179414133 +0000 UTC m=+3.769077875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.923742 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c019a610a1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:19.324973596 +0000 UTC m=+3.914637348,LastTimestamp:2026-03-09 02:41:19.324973596 +0000 UTC 
m=+3.914637348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.928661 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c019b092387 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:19.335990151 +0000 UTC m=+3.925653913,LastTimestamp:2026-03-09 02:41:19.335990151 +0000 UTC m=+3.925653913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.932533 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c019f8ca217 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:19.411716631 +0000 UTC m=+4.001380363,LastTimestamp:2026-03-09 
02:41:19.411716631 +0000 UTC m=+4.001380363,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.941081 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c01a062ca53 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:19.425751635 +0000 UTC m=+4.015415367,LastTimestamp:2026-03-09 02:41:19.425751635 +0000 UTC m=+4.015415367,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.948493 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c01cee30fe9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 
02:41:20.205909993 +0000 UTC m=+4.795573755,LastTimestamp:2026-03-09 02:41:20.205909993 +0000 UTC m=+4.795573755,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.954865 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c01dee28c0f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:20.474311695 +0000 UTC m=+5.063975467,LastTimestamp:2026-03-09 02:41:20.474311695 +0000 UTC m=+5.063975467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.959662 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c01dfb28856 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:20.48794223 +0000 UTC m=+5.077606012,LastTimestamp:2026-03-09 02:41:20.48794223 +0000 UTC 
m=+5.077606012,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.963751 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c01dfcbeeb6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:20.489606838 +0000 UTC m=+5.079270580,LastTimestamp:2026-03-09 02:41:20.489606838 +0000 UTC m=+5.079270580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.968260 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c01f005d5c6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:20.761836998 +0000 UTC 
m=+5.351500760,LastTimestamp:2026-03-09 02:41:20.761836998 +0000 UTC m=+5.351500760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.972459 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c01f11176cd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:20.779376333 +0000 UTC m=+5.369040095,LastTimestamp:2026-03-09 02:41:20.779376333 +0000 UTC m=+5.369040095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.977003 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c01f12a7f95 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:20.781016981 +0000 UTC m=+5.370680753,LastTimestamp:2026-03-09 02:41:20.781016981 +0000 UTC m=+5.370680753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.980632 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c02006761f1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:21.036665329 +0000 UTC m=+5.626329101,LastTimestamp:2026-03-09 02:41:21.036665329 +0000 UTC m=+5.626329101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.984568 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c020196ba62 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:21.056545378 +0000 UTC m=+5.646209150,LastTimestamp:2026-03-09 02:41:21.056545378 +0000 UTC m=+5.646209150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.988663 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c0201ae8170 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:21.058103664 +0000 UTC m=+5.647767406,LastTimestamp:2026-03-09 02:41:21.058103664 +0000 UTC m=+5.647767406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.995210 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c02110d9f52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:21.315995474 +0000 UTC m=+5.905659206,LastTimestamp:2026-03-09 02:41:21.315995474 +0000 UTC m=+5.905659206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:20 crc kubenswrapper[4901]: E0309 02:42:20.999500 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c0211a261b9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:21.325744569 +0000 UTC m=+5.915408311,LastTimestamp:2026-03-09 02:41:21.325744569 +0000 UTC m=+5.915408311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.003260 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c0211b574b0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:21.326994608 +0000 UTC m=+5.916658340,LastTimestamp:2026-03-09 02:41:21.326994608 +0000 UTC m=+5.916658340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.007439 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c022049759f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:21.571575199 +0000 UTC m=+6.161238961,LastTimestamp:2026-03-09 02:41:21.571575199 +0000 UTC m=+6.161238961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.011797 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b0c0221645e44 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:21.590115908 +0000 UTC m=+6.179779690,LastTimestamp:2026-03-09 02:41:21.590115908 +0000 UTC m=+6.179779690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.018267 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 02:42:21 crc kubenswrapper[4901]: &Event{ObjectMeta:{kube-controller-manager-crc.189b0c0384125f37 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 02:42:21 crc kubenswrapper[4901]: body: Mar 09 02:42:21 crc kubenswrapper[4901]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:27.540653879 +0000 UTC m=+12.130317651,LastTimestamp:2026-03-09 02:41:27.540653879 +0000 UTC m=+12.130317651,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 02:42:21 crc 
kubenswrapper[4901]: > Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.022951 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0384137955 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:27.540726101 +0000 UTC m=+12.130389873,LastTimestamp:2026-03-09 02:41:27.540726101 +0000 UTC m=+12.130389873,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.027479 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 02:42:21 crc kubenswrapper[4901]: &Event{ObjectMeta:{kube-apiserver-crc.189b0c0433b9bcb7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 
02:42:21 crc kubenswrapper[4901]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 02:42:21 crc kubenswrapper[4901]: Mar 09 02:42:21 crc kubenswrapper[4901]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:30.487635127 +0000 UTC m=+15.077298869,LastTimestamp:2026-03-09 02:41:30.487635127 +0000 UTC m=+15.077298869,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 02:42:21 crc kubenswrapper[4901]: > Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.030795 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c0433badf26 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:30.487709478 +0000 UTC m=+15.077373220,LastTimestamp:2026-03-09 02:41:30.487709478 +0000 UTC m=+15.077373220,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: I0309 02:42:21.036876 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API 
group "storage.k8s.io" at the cluster scope Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.036825 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 02:42:21 crc kubenswrapper[4901]: &Event{ObjectMeta:{kube-apiserver-crc.189b0c043447546d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 09 02:42:21 crc kubenswrapper[4901]: body: [+]ping ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]log ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]etcd ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/generic-apiserver-start-informers ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/priority-and-fairness-filter ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 09 02:42:21 crc kubenswrapper[4901]: 
[+]poststarthook/start-apiextensions-informers ok Mar 09 02:42:21 crc kubenswrapper[4901]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 09 02:42:21 crc kubenswrapper[4901]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/start-system-namespaces-controller ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 09 02:42:21 crc kubenswrapper[4901]: [-]poststarthook/start-service-ip-repair-controllers failed: reason withheld Mar 09 02:42:21 crc kubenswrapper[4901]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 09 02:42:21 crc kubenswrapper[4901]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 09 02:42:21 crc kubenswrapper[4901]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Mar 09 02:42:21 crc kubenswrapper[4901]: [-]poststarthook/bootstrap-controller failed: reason withheld Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/start-kube-aggregator-informers ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 09 02:42:21 crc kubenswrapper[4901]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 09 02:42:21 crc 
kubenswrapper[4901]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]autoregister-completion ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/apiservice-openapi-controller ok Mar 09 02:42:21 crc kubenswrapper[4901]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 09 02:42:21 crc kubenswrapper[4901]: livez check failed Mar 09 02:42:21 crc kubenswrapper[4901]: Mar 09 02:42:21 crc kubenswrapper[4901]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:30.496914541 +0000 UTC m=+15.086578303,LastTimestamp:2026-03-09 02:41:30.496914541 +0000 UTC m=+15.086578303,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 02:42:21 crc kubenswrapper[4901]: > Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.043302 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c0434489a4b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:30.496997963 +0000 UTC m=+15.086661735,LastTimestamp:2026-03-09 02:41:30.496997963 +0000 UTC m=+15.086661735,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.047410 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 02:42:21 crc kubenswrapper[4901]: &Event{ObjectMeta:{kube-apiserver-crc.189b0c043e4e41d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 09 02:42:21 crc kubenswrapper[4901]: body: Mar 09 02:42:21 crc kubenswrapper[4901]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:30.665140688 +0000 UTC m=+15.254804420,LastTimestamp:2026-03-09 02:41:30.665140688 +0000 UTC m=+15.254804420,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 02:42:21 crc kubenswrapper[4901]: > Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.052347 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c043e4f2c14 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:30.66520066 +0000 UTC m=+15.254864392,LastTimestamp:2026-03-09 02:41:30.66520066 +0000 UTC m=+15.254864392,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.059747 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b0c018cc6be3e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b0c018cc6be3e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:19.096757822 +0000 UTC m=+3.686421554,LastTimestamp:2026-03-09 02:41:31.266758399 +0000 UTC m=+15.856422171,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.064285 
4901 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b0c0384125f37\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 02:42:21 crc kubenswrapper[4901]: &Event{ObjectMeta:{kube-controller-manager-crc.189b0c0384125f37 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 02:42:21 crc kubenswrapper[4901]: body: Mar 09 02:42:21 crc kubenswrapper[4901]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:27.540653879 +0000 UTC m=+12.130317651,LastTimestamp:2026-03-09 02:41:37.540512571 +0000 UTC m=+22.130176303,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 02:42:21 crc kubenswrapper[4901]: > Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.068354 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b0c0384137955\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0384137955 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:27.540726101 +0000 UTC m=+12.130389873,LastTimestamp:2026-03-09 02:41:37.540562812 +0000 UTC m=+22.130226544,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.075711 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b0c0384125f37\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 02:42:21 crc kubenswrapper[4901]: &Event{ObjectMeta:{kube-controller-manager-crc.189b0c0384125f37 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 02:42:21 crc kubenswrapper[4901]: body: Mar 09 02:42:21 crc kubenswrapper[4901]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:27.540653879 +0000 UTC m=+12.130317651,LastTimestamp:2026-03-09 02:41:47.541239402 
+0000 UTC m=+32.130903154,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 02:42:21 crc kubenswrapper[4901]: > Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.079745 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b0c0384137955\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0384137955 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:27.540726101 +0000 UTC m=+12.130389873,LastTimestamp:2026-03-09 02:41:47.541298444 +0000 UTC m=+32.130962176,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.083796 4901 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c082c573153 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:47.543605587 +0000 UTC m=+32.133269339,LastTimestamp:2026-03-09 02:41:47.543605587 +0000 UTC m=+32.133269339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.090120 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b0c0121f3bada\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0121f3bada openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.304543962 +0000 UTC m=+1.894207704,LastTimestamp:2026-03-09 02:41:47.671495202 +0000 UTC m=+32.261158964,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.094593 4901 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b0c0136375372\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0136375372 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.644518258 +0000 UTC m=+2.234182020,LastTimestamp:2026-03-09 02:41:47.912184804 +0000 UTC m=+32.501848576,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.098532 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b0c01374dbcfb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c01374dbcfb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:17.662764283 +0000 UTC 
m=+2.252428015,LastTimestamp:2026-03-09 02:41:47.927441767 +0000 UTC m=+32.517105499,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.107062 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b0c0384125f37\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 02:42:21 crc kubenswrapper[4901]: &Event{ObjectMeta:{kube-controller-manager-crc.189b0c0384125f37 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 02:42:21 crc kubenswrapper[4901]: body: Mar 09 02:42:21 crc kubenswrapper[4901]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:27.540653879 +0000 UTC m=+12.130317651,LastTimestamp:2026-03-09 02:41:57.541761646 +0000 UTC m=+42.131425408,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 02:42:21 crc kubenswrapper[4901]: > Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.113595 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b0c0384137955\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189b0c0384137955 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:27.540726101 +0000 UTC m=+12.130389873,LastTimestamp:2026-03-09 02:41:57.541845288 +0000 UTC m=+42.131509060,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:42:21 crc kubenswrapper[4901]: E0309 02:42:21.121064 4901 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b0c0384125f37\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 02:42:21 crc kubenswrapper[4901]: &Event{ObjectMeta:{kube-controller-manager-crc.189b0c0384125f37 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 02:42:21 crc kubenswrapper[4901]: body: Mar 09 02:42:21 crc kubenswrapper[4901]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:41:27.540653879 +0000 UTC m=+12.130317651,LastTimestamp:2026-03-09 02:42:07.541313095 +0000 UTC m=+52.130976857,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 02:42:21 crc kubenswrapper[4901]: > Mar 09 02:42:22 crc kubenswrapper[4901]: I0309 02:42:22.036934 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 02:42:23 crc kubenswrapper[4901]: I0309 02:42:23.036938 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 02:42:24 crc kubenswrapper[4901]: I0309 02:42:24.037076 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 02:42:24 crc kubenswrapper[4901]: I0309 02:42:24.540312 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:42:24 crc kubenswrapper[4901]: I0309 02:42:24.540541 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:24 crc kubenswrapper[4901]: I0309 02:42:24.542200 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:24 crc kubenswrapper[4901]: I0309 02:42:24.542288 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 02:42:24 crc kubenswrapper[4901]: I0309 02:42:24.542307 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:24 crc kubenswrapper[4901]: I0309 02:42:24.546275 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.039941 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.496911 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.497209 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.498958 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.499007 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.499021 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:25 crc kubenswrapper[4901]: E0309 02:42:25.918665 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.931828 4901 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.933038 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.933091 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.933108 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:25 crc kubenswrapper[4901]: I0309 02:42:25.933141 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:42:25 crc kubenswrapper[4901]: E0309 02:42:25.938823 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 02:42:26 crc kubenswrapper[4901]: I0309 02:42:26.037468 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 02:42:26 crc kubenswrapper[4901]: E0309 02:42:26.197599 4901 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 02:42:26 crc kubenswrapper[4901]: I0309 02:42:26.500333 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:26 crc kubenswrapper[4901]: I0309 02:42:26.501657 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:26 crc kubenswrapper[4901]: I0309 02:42:26.501717 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 02:42:26 crc kubenswrapper[4901]: I0309 02:42:26.501737 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.037917 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.105548 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.107104 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.107305 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.107419 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.108262 4901 scope.go:117] "RemoveContainer" containerID="7e3fac986a6132fe6180180b96999fcb0b1e6be73db98c01464aae6f5bf83630" Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.504602 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.506990 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc"} Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.507153 4901 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.508060 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.508108 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:27 crc kubenswrapper[4901]: I0309 02:42:27.508120 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.035925 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.123745 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.155633 4901 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.511567 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.512993 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.516305 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc" 
exitCode=255 Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.516345 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc"} Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.516383 4901 scope.go:117] "RemoveContainer" containerID="7e3fac986a6132fe6180180b96999fcb0b1e6be73db98c01464aae6f5bf83630" Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.516561 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.518780 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.518802 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.518813 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:28 crc kubenswrapper[4901]: I0309 02:42:28.519404 4901 scope.go:117] "RemoveContainer" containerID="d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc" Mar 09 02:42:28 crc kubenswrapper[4901]: E0309 02:42:28.519591 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:42:29 crc kubenswrapper[4901]: I0309 02:42:29.038143 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 02:42:29 crc kubenswrapper[4901]: I0309 02:42:29.521802 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 02:42:30 crc kubenswrapper[4901]: I0309 02:42:30.039560 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 02:42:30 crc kubenswrapper[4901]: I0309 02:42:30.664435 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:42:30 crc kubenswrapper[4901]: I0309 02:42:30.664729 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:30 crc kubenswrapper[4901]: I0309 02:42:30.666679 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:30 crc kubenswrapper[4901]: I0309 02:42:30.666752 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:30 crc kubenswrapper[4901]: I0309 02:42:30.666778 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:30 crc kubenswrapper[4901]: I0309 02:42:30.667853 4901 scope.go:117] "RemoveContainer" containerID="d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc" Mar 09 02:42:30 crc kubenswrapper[4901]: E0309 02:42:30.668189 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:42:31 crc kubenswrapper[4901]: I0309 02:42:31.036749 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 02:42:31 crc kubenswrapper[4901]: I0309 02:42:31.105933 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:31 crc kubenswrapper[4901]: I0309 02:42:31.107677 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:31 crc kubenswrapper[4901]: I0309 02:42:31.107718 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:31 crc kubenswrapper[4901]: I0309 02:42:31.107731 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:31 crc kubenswrapper[4901]: I0309 02:42:31.155040 4901 csr.go:261] certificate signing request csr-j9zqx is approved, waiting to be issued Mar 09 02:42:31 crc kubenswrapper[4901]: I0309 02:42:31.164295 4901 csr.go:257] certificate signing request csr-j9zqx is issued Mar 09 02:42:31 crc kubenswrapper[4901]: I0309 02:42:31.222251 4901 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 09 02:42:31 crc kubenswrapper[4901]: I0309 02:42:31.887251 4901 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.166065 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, 
rotation deadline is 2026-11-12 12:05:10.193209709 +0000 UTC Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.166109 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5961h22m38.027103104s for next certificate rotation Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.960601 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.962150 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.962259 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.962293 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.962560 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.977121 4901 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.977830 4901 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 09 02:42:32 crc kubenswrapper[4901]: E0309 02:42:32.978034 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.982879 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.983054 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.983201 4901 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.983396 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:32 crc kubenswrapper[4901]: I0309 02:42:32.983577 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:32Z","lastTransitionTime":"2026-03-09T02:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.003638 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.013532 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.013789 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.013980 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.014521 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.014773 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:33Z","lastTransitionTime":"2026-03-09T02:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.032059 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.043863 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.043912 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.043923 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.043946 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.043958 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:33Z","lastTransitionTime":"2026-03-09T02:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.059495 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.069406 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.069458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.069472 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.069497 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.069515 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:33Z","lastTransitionTime":"2026-03-09T02:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.082089 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.082856 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.083049 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.184117 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.284709 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.384912 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.485842 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.586493 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.687553 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 02:42:33 crc kubenswrapper[4901]: E0309 02:42:33.788971 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.887344 4901 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.892678 4901 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.892912 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.893101 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.893319 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.893667 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:33Z","lastTransitionTime":"2026-03-09T02:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.997148 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.998001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.998186 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.998459 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:33 crc kubenswrapper[4901]: I0309 02:42:33.998610 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:33Z","lastTransitionTime":"2026-03-09T02:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.050737 4901 apiserver.go:52] "Watching apiserver" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.057731 4901 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.058063 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.058551 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.058707 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.058754 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.058820 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.058861 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.059474 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.059576 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.059598 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.060391 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.066316 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.066414 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.066489 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.066835 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.069360 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.069946 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.073761 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.075370 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.075956 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.100883 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:34 crc 
kubenswrapper[4901]: I0309 02:42:34.100944 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.100964 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.100991 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.101009 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:34Z","lastTransitionTime":"2026-03-09T02:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.118262 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.135355 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.137855 4901 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.156674 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.167609 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.167680 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.167720 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.167779 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.167815 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.167845 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.167880 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.167916 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.167952 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.167985 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168017 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168048 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168080 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168115 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168147 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168178 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168210 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168289 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168320 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168357 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168388 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168421 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168452 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168486 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168520 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168553 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168622 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168654 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168688 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168721 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168754 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168783 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168819 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168852 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168896 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168927 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168960 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168994 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169024 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169054 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169085 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169115 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169147 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169184 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169253 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169287 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169321 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169355 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169387 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169419 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169455 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169544 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169701 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169738 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169772 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169806 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 
02:42:34.169837 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169872 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169906 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169941 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.169973 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170005 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170037 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170073 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170109 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170139 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170510 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170607 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170676 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170731 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170782 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170832 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170880 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170930 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170973 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171019 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171068 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171115 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171164 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171212 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171300 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171347 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171393 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171440 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171491 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171537 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171582 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171628 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171669 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: 
I0309 02:42:34.171714 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171765 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171803 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171850 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171897 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171946 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171990 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172036 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172086 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172132 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172184 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 02:42:34 
crc kubenswrapper[4901]: I0309 02:42:34.172265 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172324 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172371 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172419 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172469 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172517 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172599 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172651 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172703 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172758 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172810 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 02:42:34 
crc kubenswrapper[4901]: I0309 02:42:34.172861 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172915 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172967 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173013 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173062 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173112 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173170 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173217 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173403 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173451 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173497 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173542 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173587 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173637 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173689 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173738 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173789 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 
02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173938 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173995 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.174044 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.174821 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.174890 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.174942 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.174980 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175015 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175058 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175097 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175136 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175170 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175204 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175292 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175332 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175369 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175421 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175474 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175530 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175600 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175438 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175654 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.168839 4901 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175707 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.170445 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175757 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175715 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.171322 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175807 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175859 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175914 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175966 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176021 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176073 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176127 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176181 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176334 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176395 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176452 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176511 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176586 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176641 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176696 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 02:42:34 crc 
kubenswrapper[4901]: I0309 02:42:34.176751 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176806 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176857 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176907 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176962 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177007 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177054 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177108 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177161 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177211 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177301 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177355 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177571 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177638 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177680 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177725 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177780 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177833 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177892 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177945 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177999 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178055 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 02:42:34 crc kubenswrapper[4901]: 
I0309 02:42:34.178113 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178173 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178280 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178345 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178404 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178462 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178524 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178632 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178696 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178760 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178823 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178882 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178930 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178968 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179011 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179050 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" 
(UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179105 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179143 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179179 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179253 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:34 crc 
kubenswrapper[4901]: I0309 02:42:34.179324 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179446 4901 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179476 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179498 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179966 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.185761 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.192146 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.194037 4901 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.197087 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172073 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172168 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172250 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172514 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.172691 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173082 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173121 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.173979 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.174309 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.174556 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.174981 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.174990 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175197 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175429 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175846 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.175862 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176400 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.176413 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177028 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177064 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177316 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177359 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177422 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177931 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177954 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.177960 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178110 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178055 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178316 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178459 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.178548 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179055 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179119 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179129 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179176 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179336 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.179934 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.180160 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.180299 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.180488 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.180663 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.180721 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.180947 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.181098 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.181108 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.181110 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.181510 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.181530 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.180724 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.181749 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.182458 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.182848 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.183004 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.183460 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.183847 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.183896 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.183937 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.184027 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.184062 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.183985 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.184359 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.184898 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.185375 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.185809 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.186185 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.187011 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.186679 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.187302 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.187354 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.187431 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.187521 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.187848 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.187913 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.188289 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.188717 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.189059 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.189431 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.189837 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:42:34.689181888 +0000 UTC m=+79.278845660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.191117 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.191380 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.191635 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.193074 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.193100 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.193946 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.194068 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.194591 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.195004 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.195151 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.195660 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.196156 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.196543 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.196524 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.197010 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.197021 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.198376 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.198523 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.199101 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.199293 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.199327 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.199905 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.200394 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.200532 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.200770 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.200878 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.200948 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.201763 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.201909 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.201973 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.202143 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.202449 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.202570 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.202577 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.202790 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.202890 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.203081 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:34.703007836 +0000 UTC m=+79.292671658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.203384 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-09 02:42:34.703351034 +0000 UTC m=+79.293014846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.203605 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.204372 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.205079 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.207102 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.207840 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.208193 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.208253 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.208751 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.208796 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.208819 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.208879 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.208905 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:34Z","lastTransitionTime":"2026-03-09T02:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.210333 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.211603 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.213284 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.213742 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.214216 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.214290 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.214475 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.214950 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.215158 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.215531 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.215811 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.216435 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.216916 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.218071 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.218099 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.218118 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.218204 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:34.718180116 +0000 UTC m=+79.307843958 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.218412 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.218952 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.222597 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.222670 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.225009 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.225047 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.225478 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.225589 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.225879 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.226216 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.226283 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.226368 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.226755 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.227267 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.227289 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.227305 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.227370 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:34.727350054 +0000 UTC m=+79.317013796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.228019 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.228087 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.228151 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.228473 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.229115 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.230990 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.231120 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.231345 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.231396 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.231701 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.231739 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.231913 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.231880 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.232073 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.232148 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.232309 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.232412 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.232461 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.232524 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.232626 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.232653 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.232685 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.233439 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.233509 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.239694 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.239767 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.240856 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.241427 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.242114 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.242158 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.242394 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.242661 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.243810 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.243962 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.244423 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.244493 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.245051 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.246755 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.246969 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.247077 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.247170 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.247347 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.247483 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.247590 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.247683 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.247691 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.248446 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.250377 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.251640 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.251968 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.261069 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.265047 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.278172 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.280771 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.280879 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.280976 4901 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc 
kubenswrapper[4901]: I0309 02:42:34.281018 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.280962 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281037 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281120 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281143 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281149 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281164 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281294 4901 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281321 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281343 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281364 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281386 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281406 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281425 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 
crc kubenswrapper[4901]: I0309 02:42:34.281446 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281465 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281484 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281506 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281524 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281542 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281561 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281581 4901 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281602 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281621 4901 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281639 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281658 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281679 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281698 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281716 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on 
node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281734 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281753 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281771 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281790 4901 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281809 4901 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281828 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281847 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: 
I0309 02:42:34.281866 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281884 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281902 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281920 4901 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281939 4901 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281957 4901 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281975 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.281993 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282012 4901 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282030 4901 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282050 4901 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282068 4901 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282089 4901 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282107 4901 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282127 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282145 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282163 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282181 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282199 4901 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282218 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282284 4901 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282306 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282325 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282342 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282361 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282379 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282397 4901 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282416 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282435 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc 
kubenswrapper[4901]: I0309 02:42:34.282452 4901 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282470 4901 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282489 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282506 4901 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282524 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282542 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282560 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282581 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282599 4901 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282618 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282636 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282654 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282672 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282689 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282707 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282725 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282743 4901 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282761 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282780 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282798 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282817 4901 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282836 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282854 
4901 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282872 4901 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282890 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282907 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282925 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282946 4901 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282966 4901 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.282983 4901 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283003 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283020 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283038 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283059 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283076 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283093 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283112 4901 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" 
Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283130 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283148 4901 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283166 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283184 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283204 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283257 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283284 4901 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc 
kubenswrapper[4901]: I0309 02:42:34.283302 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283320 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283338 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283356 4901 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283374 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283393 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283410 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283428 4901 reconciler_common.go:293] "Volume 
detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283446 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283465 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283482 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283501 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283519 4901 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283537 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283555 4901 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283573 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283591 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283609 4901 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283626 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283644 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283664 4901 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283682 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" 
DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283700 4901 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283719 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283738 4901 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283756 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283776 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283795 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283813 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node 
\"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283831 4901 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283850 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283867 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283886 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283906 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283924 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283945 4901 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc 
kubenswrapper[4901]: I0309 02:42:34.283964 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.283983 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284001 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284020 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284038 4901 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284056 4901 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284074 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284092 4901 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284110 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284129 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284148 4901 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284168 4901 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284186 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284205 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284251 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284278 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284296 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284314 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284332 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284350 4901 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284369 4901 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284386 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc 
kubenswrapper[4901]: I0309 02:42:34.284404 4901 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284422 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284439 4901 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284457 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284475 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284494 4901 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284517 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284535 4901 reconciler_common.go:293] "Volume detached for 
volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284553 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284573 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284591 4901 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284608 4901 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284663 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284687 4901 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284710 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284729 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284748 4901 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284766 4901 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284784 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284804 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284822 4901 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284841 4901 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.284862 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.288020 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.294130 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.312603 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.312738 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.312758 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.312783 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.312800 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:34Z","lastTransitionTime":"2026-03-09T02:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.385424 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.385917 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.385988 4901 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.402624 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 02:42:34 crc kubenswrapper[4901]: W0309 02:42:34.406172 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-a2c97f52452d7d0598fe6376ffa8cd2a4cb49b6f31d07ff67d31808e282985f0 WatchSource:0}: Error finding container a2c97f52452d7d0598fe6376ffa8cd2a4cb49b6f31d07ff67d31808e282985f0: Status 404 returned error can't find the container with id a2c97f52452d7d0598fe6376ffa8cd2a4cb49b6f31d07ff67d31808e282985f0 Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.410137 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.410567 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.412280 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.415667 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.415734 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.415753 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.415778 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.415802 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:34Z","lastTransitionTime":"2026-03-09T02:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:34 crc kubenswrapper[4901]: W0309 02:42:34.424736 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-28818377dd2e0fb23147cd7b223e590efca4903c244d5643eff391505afaf32e WatchSource:0}: Error finding container 28818377dd2e0fb23147cd7b223e590efca4903c244d5643eff391505afaf32e: Status 404 returned error can't find the container with id 28818377dd2e0fb23147cd7b223e590efca4903c244d5643eff391505afaf32e Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.430258 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 02:42:34 crc kubenswrapper[4901]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 09 02:42:34 crc kubenswrapper[4901]: set -o allexport Mar 09 02:42:34 crc kubenswrapper[4901]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 09 02:42:34 crc kubenswrapper[4901]: source /etc/kubernetes/apiserver-url.env Mar 09 02:42:34 crc kubenswrapper[4901]: else Mar 09 02:42:34 crc kubenswrapper[4901]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 09 02:42:34 crc kubenswrapper[4901]: exit 1 Mar 09 02:42:34 crc kubenswrapper[4901]: fi Mar 09 02:42:34 crc kubenswrapper[4901]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 09 02:42:34 crc kubenswrapper[4901]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 02:42:34 crc kubenswrapper[4901]: > logger="UnhandledError" Mar 09 02:42:34 crc kubenswrapper[4901]: W0309 02:42:34.431565 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-98faedad9fd1d7e907924fac935a8a3c902c90e6b98a811464e28df5e2339568 WatchSource:0}: Error finding container 98faedad9fd1d7e907924fac935a8a3c902c90e6b98a811464e28df5e2339568: Status 404 returned error can't find the container with id 98faedad9fd1d7e907924fac935a8a3c902c90e6b98a811464e28df5e2339568 Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.431675 4901 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.434869 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 02:42:34 crc kubenswrapper[4901]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 02:42:34 crc kubenswrapper[4901]: if [[ -f "/env/_master" ]]; then Mar 09 02:42:34 crc kubenswrapper[4901]: set -o allexport Mar 09 02:42:34 crc kubenswrapper[4901]: source "/env/_master" Mar 09 02:42:34 crc kubenswrapper[4901]: set +o allexport Mar 09 02:42:34 crc kubenswrapper[4901]: fi Mar 09 02:42:34 crc kubenswrapper[4901]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 09 02:42:34 crc kubenswrapper[4901]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 09 02:42:34 crc kubenswrapper[4901]: ho_enable="--enable-hybrid-overlay" Mar 09 02:42:34 crc kubenswrapper[4901]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 09 02:42:34 crc kubenswrapper[4901]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 09 02:42:34 crc kubenswrapper[4901]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 09 02:42:34 crc kubenswrapper[4901]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 02:42:34 crc kubenswrapper[4901]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 09 02:42:34 crc kubenswrapper[4901]: --webhook-host=127.0.0.1 \ Mar 09 02:42:34 crc kubenswrapper[4901]: --webhook-port=9743 \ Mar 09 02:42:34 crc kubenswrapper[4901]: ${ho_enable} \ Mar 09 02:42:34 crc kubenswrapper[4901]: --enable-interconnect \ Mar 09 02:42:34 crc kubenswrapper[4901]: --disable-approver \ Mar 09 02:42:34 crc kubenswrapper[4901]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 09 02:42:34 crc kubenswrapper[4901]: --wait-for-kubernetes-api=200s \ Mar 09 02:42:34 crc kubenswrapper[4901]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 09 02:42:34 crc kubenswrapper[4901]: --loglevel="${LOGLEVEL}" Mar 09 02:42:34 crc kubenswrapper[4901]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 02:42:34 crc kubenswrapper[4901]: > logger="UnhandledError" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.441449 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 02:42:34 crc kubenswrapper[4901]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 02:42:34 crc 
kubenswrapper[4901]: if [[ -f "/env/_master" ]]; then Mar 09 02:42:34 crc kubenswrapper[4901]: set -o allexport Mar 09 02:42:34 crc kubenswrapper[4901]: source "/env/_master" Mar 09 02:42:34 crc kubenswrapper[4901]: set +o allexport Mar 09 02:42:34 crc kubenswrapper[4901]: fi Mar 09 02:42:34 crc kubenswrapper[4901]: Mar 09 02:42:34 crc kubenswrapper[4901]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 09 02:42:34 crc kubenswrapper[4901]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 02:42:34 crc kubenswrapper[4901]: --disable-webhook \ Mar 09 02:42:34 crc kubenswrapper[4901]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 09 02:42:34 crc kubenswrapper[4901]: --loglevel="${LOGLEVEL}" Mar 09 02:42:34 crc kubenswrapper[4901]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 02:42:34 crc kubenswrapper[4901]: > logger="UnhandledError" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.442751 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.519580 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.519644 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.519663 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.519689 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.519707 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:34Z","lastTransitionTime":"2026-03-09T02:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.539214 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"98faedad9fd1d7e907924fac935a8a3c902c90e6b98a811464e28df5e2339568"} Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.541923 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 02:42:34 crc kubenswrapper[4901]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 02:42:34 crc kubenswrapper[4901]: if [[ -f "/env/_master" ]]; then Mar 09 02:42:34 crc kubenswrapper[4901]: set -o allexport Mar 09 02:42:34 crc kubenswrapper[4901]: source "/env/_master" Mar 09 02:42:34 crc kubenswrapper[4901]: set +o allexport Mar 09 02:42:34 crc kubenswrapper[4901]: fi Mar 09 02:42:34 crc kubenswrapper[4901]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 09 02:42:34 crc kubenswrapper[4901]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 09 02:42:34 crc kubenswrapper[4901]: ho_enable="--enable-hybrid-overlay" Mar 09 02:42:34 crc kubenswrapper[4901]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 09 02:42:34 crc kubenswrapper[4901]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 09 02:42:34 crc kubenswrapper[4901]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 09 02:42:34 crc kubenswrapper[4901]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 02:42:34 crc kubenswrapper[4901]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 09 02:42:34 crc kubenswrapper[4901]: --webhook-host=127.0.0.1 \ Mar 09 02:42:34 crc kubenswrapper[4901]: --webhook-port=9743 \ Mar 09 02:42:34 crc kubenswrapper[4901]: ${ho_enable} \ Mar 09 02:42:34 crc kubenswrapper[4901]: --enable-interconnect \ Mar 09 02:42:34 crc kubenswrapper[4901]: --disable-approver \ Mar 09 02:42:34 crc kubenswrapper[4901]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 09 02:42:34 crc kubenswrapper[4901]: --wait-for-kubernetes-api=200s \ Mar 09 02:42:34 crc kubenswrapper[4901]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 09 02:42:34 crc kubenswrapper[4901]: --loglevel="${LOGLEVEL}" Mar 09 02:42:34 crc kubenswrapper[4901]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 02:42:34 crc kubenswrapper[4901]: > logger="UnhandledError" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.542267 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"28818377dd2e0fb23147cd7b223e590efca4903c244d5643eff391505afaf32e"} Mar 09 02:42:34 crc 
kubenswrapper[4901]: E0309 02:42:34.543562 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 02:42:34 crc kubenswrapper[4901]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 09 02:42:34 crc kubenswrapper[4901]: set -o allexport Mar 09 02:42:34 crc kubenswrapper[4901]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 09 02:42:34 crc kubenswrapper[4901]: source /etc/kubernetes/apiserver-url.env Mar 09 02:42:34 crc kubenswrapper[4901]: else Mar 09 02:42:34 crc kubenswrapper[4901]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 09 02:42:34 crc kubenswrapper[4901]: exit 1 Mar 09 02:42:34 crc kubenswrapper[4901]: fi Mar 09 02:42:34 crc kubenswrapper[4901]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 09 02:42:34 crc kubenswrapper[4901]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163
a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 02:42:34 crc kubenswrapper[4901]: > logger="UnhandledError" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.546200 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.546305 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 02:42:34 crc kubenswrapper[4901]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 02:42:34 crc kubenswrapper[4901]: if [[ -f "/env/_master" 
]]; then Mar 09 02:42:34 crc kubenswrapper[4901]: set -o allexport Mar 09 02:42:34 crc kubenswrapper[4901]: source "/env/_master" Mar 09 02:42:34 crc kubenswrapper[4901]: set +o allexport Mar 09 02:42:34 crc kubenswrapper[4901]: fi Mar 09 02:42:34 crc kubenswrapper[4901]: Mar 09 02:42:34 crc kubenswrapper[4901]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 09 02:42:34 crc kubenswrapper[4901]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 02:42:34 crc kubenswrapper[4901]: --disable-webhook \ Mar 09 02:42:34 crc kubenswrapper[4901]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 09 02:42:34 crc kubenswrapper[4901]: --loglevel="${LOGLEVEL}" Mar 09 02:42:34 crc kubenswrapper[4901]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 02:42:34 crc kubenswrapper[4901]: > logger="UnhandledError" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.546481 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a2c97f52452d7d0598fe6376ffa8cd2a4cb49b6f31d07ff67d31808e282985f0"} Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.547690 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: 
\"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.548537 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.549888 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.560050 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.575031 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.591351 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.609120 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.622040 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.622109 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.622132 4901 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.622174 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.622189 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:34Z","lastTransitionTime":"2026-03-09T02:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.624377 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.640739 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.656659 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.671400 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.685629 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.702357 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.718997 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.725076 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.725142 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.725164 4901 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.725191 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.725213 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:34Z","lastTransitionTime":"2026-03-09T02:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.736206 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.790904 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.791070 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:34 crc 
kubenswrapper[4901]: E0309 02:42:34.791139 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:42:35.791064665 +0000 UTC m=+80.380728457 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791173 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791267 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:35.79124927 +0000 UTC m=+80.380912992 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.791219 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.791389 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.791490 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791526 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791593 4901 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791636 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791661 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791609 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:35.791590578 +0000 UTC m=+80.381254340 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791726 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791820 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791884 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.791755 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:35.791734971 +0000 UTC m=+80.381398733 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:34 crc kubenswrapper[4901]: E0309 02:42:34.792022 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:35.792003898 +0000 UTC m=+80.381667670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.828489 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.828565 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.828583 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.828609 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.828630 4901 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:34Z","lastTransitionTime":"2026-03-09T02:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.932257 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.932313 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.932330 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.932356 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:34 crc kubenswrapper[4901]: I0309 02:42:34.932373 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:34Z","lastTransitionTime":"2026-03-09T02:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.035153 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.035214 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.035253 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.035276 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.035317 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:35Z","lastTransitionTime":"2026-03-09T02:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.138480 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.138529 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.138548 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.138572 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.138589 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:35Z","lastTransitionTime":"2026-03-09T02:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.240980 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.241049 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.241067 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.241092 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.241110 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:35Z","lastTransitionTime":"2026-03-09T02:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.343858 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.343929 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.343949 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.343985 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.344007 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:35Z","lastTransitionTime":"2026-03-09T02:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.447388 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.447463 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.447482 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.447516 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.447539 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:35Z","lastTransitionTime":"2026-03-09T02:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.549965 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.550028 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.550053 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.550086 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.550107 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:35Z","lastTransitionTime":"2026-03-09T02:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.652977 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.653060 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.653111 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.653145 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.653169 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:35Z","lastTransitionTime":"2026-03-09T02:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.756259 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.756316 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.756335 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.756370 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.756387 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:35Z","lastTransitionTime":"2026-03-09T02:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.801722 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.801823 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.801863 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.801926 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:42:37.801887979 +0000 UTC m=+82.391551761 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.802005 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802056 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802178 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802208 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802257 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:35 crc kubenswrapper[4901]: 
E0309 02:42:35.802270 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:37.802201856 +0000 UTC m=+82.391865628 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802319 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802334 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:37.802312359 +0000 UTC m=+82.391976121 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802353 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.802084 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802390 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802433 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:37.802420981 +0000 UTC m=+82.392084753 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802495 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:35 crc kubenswrapper[4901]: E0309 02:42:35.802580 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:37.802553695 +0000 UTC m=+82.392217467 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.859803 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.859864 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.859884 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.859910 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.859929 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:35Z","lastTransitionTime":"2026-03-09T02:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.963485 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.963551 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.963569 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.963595 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:35 crc kubenswrapper[4901]: I0309 02:42:35.963615 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:35Z","lastTransitionTime":"2026-03-09T02:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.066640 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.066704 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.066729 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.066758 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.066778 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:36Z","lastTransitionTime":"2026-03-09T02:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.105317 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.105386 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.105446 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:36 crc kubenswrapper[4901]: E0309 02:42:36.105520 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:36 crc kubenswrapper[4901]: E0309 02:42:36.105711 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:36 crc kubenswrapper[4901]: E0309 02:42:36.105858 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.113258 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.114397 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.117179 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.118490 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.120370 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.121206 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.122011 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.123828 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.126931 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.128305 4901 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.129473 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.130557 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.131960 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.132941 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.134107 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.135199 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.137148 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.138360 4901 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.138813 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.139757 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.141084 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.144103 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.145195 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.147673 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.148633 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.151718 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.153177 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.155249 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.156588 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.158182 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.159181 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.160369 4901 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.161401 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.162396 4901 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.162600 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.165271 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.166913 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.169882 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.170405 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.170457 4901 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.170475 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.170504 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.170522 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:36Z","lastTransitionTime":"2026-03-09T02:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.173139 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.173252 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.174764 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.176543 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" 
path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.178357 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.180959 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.182351 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.184563 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.186014 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.187604 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.188314 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.189465 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.189906 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.190669 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.192836 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.194183 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.196080 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.197167 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.203498 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.205555 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.205984 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.208972 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 09 02:42:36 crc 
kubenswrapper[4901]: I0309 02:42:36.273701 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.273767 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.273787 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.273814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.273832 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:36Z","lastTransitionTime":"2026-03-09T02:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.376934 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.377001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.377020 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.377049 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.377070 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:36Z","lastTransitionTime":"2026-03-09T02:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.480553 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.480643 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.480670 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.480702 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.480722 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:36Z","lastTransitionTime":"2026-03-09T02:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.584360 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.584436 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.584454 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.584484 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.584507 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:36Z","lastTransitionTime":"2026-03-09T02:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.687410 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.687478 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.687496 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.687521 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.687540 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:36Z","lastTransitionTime":"2026-03-09T02:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.775808 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.785820 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.789783 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.789840 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.789862 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.789890 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.789915 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:36Z","lastTransitionTime":"2026-03-09T02:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.792725 4901 scope.go:117] "RemoveContainer" containerID="d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc" Mar 09 02:42:36 crc kubenswrapper[4901]: E0309 02:42:36.792885 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.801251 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.821783 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.841652 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.864039 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.879626 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.891214 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.892179 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.892205 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.892214 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.892243 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.892252 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:36Z","lastTransitionTime":"2026-03-09T02:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.892483 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.903424 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.995190 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.995296 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.995315 4901 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.995342 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:36 crc kubenswrapper[4901]: I0309 02:42:36.995360 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:36Z","lastTransitionTime":"2026-03-09T02:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.097458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.097579 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.097604 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.097633 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.097655 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:37Z","lastTransitionTime":"2026-03-09T02:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.200283 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.200358 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.200384 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.200418 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.200441 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:37Z","lastTransitionTime":"2026-03-09T02:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.303683 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.303757 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.303778 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.303808 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.303831 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:37Z","lastTransitionTime":"2026-03-09T02:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.407596 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.407656 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.407673 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.407697 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.407714 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:37Z","lastTransitionTime":"2026-03-09T02:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.510100 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.510179 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.510198 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.510251 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.510270 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:37Z","lastTransitionTime":"2026-03-09T02:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.556134 4901 scope.go:117] "RemoveContainer" containerID="d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc" Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.556459 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.613400 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.613478 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.613500 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.613530 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.613549 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:37Z","lastTransitionTime":"2026-03-09T02:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.717308 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.717376 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.717396 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.717427 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.717446 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:37Z","lastTransitionTime":"2026-03-09T02:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.820788 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.820871 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.820894 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.820926 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.820943 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:37Z","lastTransitionTime":"2026-03-09T02:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.829270 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.829395 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.829443 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.829495 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829547 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:42:41.82951492 +0000 UTC m=+86.419178702 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.829597 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829652 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829662 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829729 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:41.829709715 +0000 UTC m=+86.419373487 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829745 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829772 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829790 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829827 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:41.829789157 +0000 UTC m=+86.419452919 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829674 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829863 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829880 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829866 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:41.829849198 +0000 UTC m=+86.419512960 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:37 crc kubenswrapper[4901]: E0309 02:42:37.829951 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:41.82993633 +0000 UTC m=+86.419600092 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.924217 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.924312 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.924330 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.924355 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:37 crc kubenswrapper[4901]: I0309 02:42:37.924374 4901 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:37Z","lastTransitionTime":"2026-03-09T02:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.027433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.027507 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.027532 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.027564 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.027587 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:38Z","lastTransitionTime":"2026-03-09T02:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.105889 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.106005 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:38 crc kubenswrapper[4901]: E0309 02:42:38.106119 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:38 crc kubenswrapper[4901]: E0309 02:42:38.106341 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.106492 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:38 crc kubenswrapper[4901]: E0309 02:42:38.106834 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.130559 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.130661 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.130702 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.130741 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.130768 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:38Z","lastTransitionTime":"2026-03-09T02:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.234269 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.234341 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.234359 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.234385 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.234403 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:38Z","lastTransitionTime":"2026-03-09T02:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.337611 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.337666 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.337683 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.337705 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.337722 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:38Z","lastTransitionTime":"2026-03-09T02:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.440854 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.440907 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.440924 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.440954 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.440972 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:38Z","lastTransitionTime":"2026-03-09T02:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.543025 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.543083 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.543098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.543118 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.543134 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:38Z","lastTransitionTime":"2026-03-09T02:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.646055 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.646125 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.646149 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.646179 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.646205 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:38Z","lastTransitionTime":"2026-03-09T02:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.749285 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.749351 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.749379 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.749414 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.749435 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:38Z","lastTransitionTime":"2026-03-09T02:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.852081 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.852161 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.852181 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.852208 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.852251 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:38Z","lastTransitionTime":"2026-03-09T02:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.955804 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.955875 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.955951 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.955989 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:38 crc kubenswrapper[4901]: I0309 02:42:38.956011 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:38Z","lastTransitionTime":"2026-03-09T02:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.057982 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.058052 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.058071 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.058095 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.058113 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:39Z","lastTransitionTime":"2026-03-09T02:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.126898 4901 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.162852 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.162957 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.162988 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.163029 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.163071 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:39Z","lastTransitionTime":"2026-03-09T02:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.266915 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.266981 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.266993 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.267010 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.267022 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:39Z","lastTransitionTime":"2026-03-09T02:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.370905 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.371006 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.371026 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.371049 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.371068 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:39Z","lastTransitionTime":"2026-03-09T02:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.474749 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.474809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.474828 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.474851 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.474869 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:39Z","lastTransitionTime":"2026-03-09T02:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.577646 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.577738 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.577753 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.577771 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.577783 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:39Z","lastTransitionTime":"2026-03-09T02:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.680437 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.680495 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.680509 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.680527 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.680541 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:39Z","lastTransitionTime":"2026-03-09T02:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.782890 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.782960 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.782979 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.783006 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.783028 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:39Z","lastTransitionTime":"2026-03-09T02:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.887029 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.887137 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.887166 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.887205 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.887279 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:39Z","lastTransitionTime":"2026-03-09T02:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.990169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.990267 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.990288 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.990316 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:39 crc kubenswrapper[4901]: I0309 02:42:39.990338 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:39Z","lastTransitionTime":"2026-03-09T02:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.093605 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.093663 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.093679 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.093701 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.093714 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:40Z","lastTransitionTime":"2026-03-09T02:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.106251 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:40 crc kubenswrapper[4901]: E0309 02:42:40.106468 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.106512 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.106548 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:40 crc kubenswrapper[4901]: E0309 02:42:40.106631 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:40 crc kubenswrapper[4901]: E0309 02:42:40.106777 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.197289 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.197361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.197385 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.197413 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.197432 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:40Z","lastTransitionTime":"2026-03-09T02:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.301374 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.301467 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.301488 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.301516 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.301536 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:40Z","lastTransitionTime":"2026-03-09T02:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.405058 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.405143 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.405167 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.405199 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.405267 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:40Z","lastTransitionTime":"2026-03-09T02:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.508887 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.508958 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.508993 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.509032 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.509060 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:40Z","lastTransitionTime":"2026-03-09T02:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.613576 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.613651 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.613670 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.613700 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.613719 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:40Z","lastTransitionTime":"2026-03-09T02:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.716074 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.716113 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.716126 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.716145 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.716178 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:40Z","lastTransitionTime":"2026-03-09T02:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.819053 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.819099 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.819118 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.819146 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.819199 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:40Z","lastTransitionTime":"2026-03-09T02:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.922941 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.923006 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.923025 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.923044 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:40 crc kubenswrapper[4901]: I0309 02:42:40.923055 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:40Z","lastTransitionTime":"2026-03-09T02:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.025757 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.025815 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.025825 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.025840 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.025849 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:41Z","lastTransitionTime":"2026-03-09T02:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.128936 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.129005 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.129023 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.129052 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.129070 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:41Z","lastTransitionTime":"2026-03-09T02:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.231622 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.231697 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.231720 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.231755 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.231779 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:41Z","lastTransitionTime":"2026-03-09T02:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.335021 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.335537 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.335561 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.335589 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.335608 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:41Z","lastTransitionTime":"2026-03-09T02:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.438760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.438821 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.438838 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.438863 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.438882 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:41Z","lastTransitionTime":"2026-03-09T02:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.543038 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.543111 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.543149 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.543181 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.543203 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:41Z","lastTransitionTime":"2026-03-09T02:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.645871 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.645934 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.645954 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.645983 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.646006 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:41Z","lastTransitionTime":"2026-03-09T02:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.749805 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.749880 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.749903 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.749938 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.749962 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:41Z","lastTransitionTime":"2026-03-09T02:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.852956 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.853020 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.853033 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.853057 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.853076 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:41Z","lastTransitionTime":"2026-03-09T02:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.871165 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.871262 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.871288 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.871307 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.871324 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871363 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:42:49.871336812 +0000 UTC m=+94.461000554 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871418 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871433 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871443 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871450 4901 
configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871482 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:49.871469435 +0000 UTC m=+94.461133167 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871498 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:49.871492106 +0000 UTC m=+94.461155838 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871538 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871546 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871552 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871554 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871572 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:49.871565197 +0000 UTC m=+94.461228929 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:41 crc kubenswrapper[4901]: E0309 02:42:41.871587 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:42:49.871578768 +0000 UTC m=+94.461242510 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.955296 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.955374 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.955391 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.955415 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:41 crc kubenswrapper[4901]: I0309 02:42:41.955433 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:41Z","lastTransitionTime":"2026-03-09T02:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.057433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.057465 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.057473 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.057486 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.057495 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:42Z","lastTransitionTime":"2026-03-09T02:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.106346 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:42 crc kubenswrapper[4901]: E0309 02:42:42.106508 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.106641 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.106779 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:42 crc kubenswrapper[4901]: E0309 02:42:42.106915 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:42 crc kubenswrapper[4901]: E0309 02:42:42.107441 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.125733 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.159969 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.160047 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.160068 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.160095 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.160113 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:42Z","lastTransitionTime":"2026-03-09T02:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.263092 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.263143 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.263159 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.263186 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.263205 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:42Z","lastTransitionTime":"2026-03-09T02:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.365544 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.365620 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.365642 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.365674 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.365698 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:42Z","lastTransitionTime":"2026-03-09T02:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.468707 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.468764 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.468780 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.468808 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.468825 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:42Z","lastTransitionTime":"2026-03-09T02:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.571636 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.571698 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.571717 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.571745 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.571764 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:42Z","lastTransitionTime":"2026-03-09T02:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.675660 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.675726 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.675743 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.675766 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.675783 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:42Z","lastTransitionTime":"2026-03-09T02:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.728329 4901 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.785500 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.785572 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.785594 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.785682 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.785712 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:42Z","lastTransitionTime":"2026-03-09T02:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.889119 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.889212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.889263 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.889294 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:42 crc kubenswrapper[4901]: I0309 02:42:42.889311 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:42Z","lastTransitionTime":"2026-03-09T02:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.012026 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.012081 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.012096 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.012116 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.012127 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.115986 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.116060 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.116079 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.116105 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.116128 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.219362 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.219449 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.219467 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.219492 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.219510 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.288417 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.288468 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.288480 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.288499 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.288510 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: E0309 02:42:43.308878 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.314909 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.314974 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.315001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.315030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.315054 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: E0309 02:42:43.333820 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.338694 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.338764 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.338784 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.338812 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.338831 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: E0309 02:42:43.350922 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.355628 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.355696 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.355717 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.355813 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.355870 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: E0309 02:42:43.371716 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.376795 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.376854 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.376874 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.376899 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.376921 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: E0309 02:42:43.396575 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:43 crc kubenswrapper[4901]: E0309 02:42:43.396843 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.399278 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.399392 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.399421 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.399459 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.399488 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.502318 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.502379 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.502398 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.502424 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.502443 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.605902 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.605978 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.606003 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.606098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.606178 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.709374 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.709427 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.709444 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.709466 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.709482 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.813065 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.813119 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.813138 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.813164 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.813181 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.915706 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.915752 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.915764 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.915782 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:43 crc kubenswrapper[4901]: I0309 02:42:43.915795 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:43Z","lastTransitionTime":"2026-03-09T02:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.018307 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.018353 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.018363 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.018386 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.018396 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:44Z","lastTransitionTime":"2026-03-09T02:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.105579 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.105699 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:44 crc kubenswrapper[4901]: E0309 02:42:44.105795 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.105828 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:44 crc kubenswrapper[4901]: E0309 02:42:44.106144 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:44 crc kubenswrapper[4901]: E0309 02:42:44.106020 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.122549 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.122631 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.122652 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.122702 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.122731 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:44Z","lastTransitionTime":"2026-03-09T02:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.226472 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.226556 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.226583 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.226646 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.226672 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:44Z","lastTransitionTime":"2026-03-09T02:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.330938 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.331014 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.331032 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.331058 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.331076 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:44Z","lastTransitionTime":"2026-03-09T02:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.433739 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.433815 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.433840 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.433871 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.433895 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:44Z","lastTransitionTime":"2026-03-09T02:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.538098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.538178 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.538201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.538262 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.538287 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:44Z","lastTransitionTime":"2026-03-09T02:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.641906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.641992 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.642015 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.642110 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.642137 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:44Z","lastTransitionTime":"2026-03-09T02:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.745932 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.746030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.746067 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.746138 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.746161 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:44Z","lastTransitionTime":"2026-03-09T02:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.849705 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.849777 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.849805 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.849849 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.849886 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:44Z","lastTransitionTime":"2026-03-09T02:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.954095 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.954163 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.954178 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.954201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:44 crc kubenswrapper[4901]: I0309 02:42:44.954253 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:44Z","lastTransitionTime":"2026-03-09T02:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.057298 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.057388 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.057408 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.057449 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.057469 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:45Z","lastTransitionTime":"2026-03-09T02:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.161158 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.161217 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.161248 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.161266 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.161278 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:45Z","lastTransitionTime":"2026-03-09T02:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.264449 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.264509 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.264524 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.264544 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.264559 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:45Z","lastTransitionTime":"2026-03-09T02:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.367909 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.368006 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.368020 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.368041 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.368054 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:45Z","lastTransitionTime":"2026-03-09T02:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.470944 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.471002 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.471015 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.471041 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.471058 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:45Z","lastTransitionTime":"2026-03-09T02:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.574262 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.574329 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.574346 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.574369 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.574383 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:45Z","lastTransitionTime":"2026-03-09T02:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.677853 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.677941 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.677956 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.677982 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.677997 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:45Z","lastTransitionTime":"2026-03-09T02:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.783147 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.783212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.783251 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.783279 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.783298 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:45Z","lastTransitionTime":"2026-03-09T02:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.885830 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.885894 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.885912 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.885939 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.885955 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:45Z","lastTransitionTime":"2026-03-09T02:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.988598 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.988650 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.988660 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.988679 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:45 crc kubenswrapper[4901]: I0309 02:42:45.988695 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:45Z","lastTransitionTime":"2026-03-09T02:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.091178 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.091260 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.091298 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.091331 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.091354 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:46Z","lastTransitionTime":"2026-03-09T02:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.106131 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.106195 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.106266 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:46 crc kubenswrapper[4901]: E0309 02:42:46.106377 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:46 crc kubenswrapper[4901]: E0309 02:42:46.106719 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:46 crc kubenswrapper[4901]: E0309 02:42:46.106886 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.123627 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.138371 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.152093 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.167851 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.194422 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.194492 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:46 crc 
kubenswrapper[4901]: I0309 02:42:46.194508 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.194533 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.194550 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:46Z","lastTransitionTime":"2026-03-09T02:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.197493 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:1
8Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.220497 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.237953 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.253710 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.270272 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.297458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.297507 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.297520 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.297541 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.297556 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:46Z","lastTransitionTime":"2026-03-09T02:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.401020 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.401096 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.401132 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.401155 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.401169 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:46Z","lastTransitionTime":"2026-03-09T02:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.503375 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.503419 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.503433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.503453 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.503469 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:46Z","lastTransitionTime":"2026-03-09T02:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.606355 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.606420 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.606434 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.606452 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.606465 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:46Z","lastTransitionTime":"2026-03-09T02:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.710119 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.710182 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.710204 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.710263 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.710280 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:46Z","lastTransitionTime":"2026-03-09T02:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.813606 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.813669 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.813687 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.813715 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.813733 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:46Z","lastTransitionTime":"2026-03-09T02:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.917423 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.917496 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.917520 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.917550 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:46 crc kubenswrapper[4901]: I0309 02:42:46.917572 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:46Z","lastTransitionTime":"2026-03-09T02:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.020468 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.020531 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.020544 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.020568 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.020586 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:47Z","lastTransitionTime":"2026-03-09T02:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.123493 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.123580 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.123597 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.123625 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.123642 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:47Z","lastTransitionTime":"2026-03-09T02:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.226693 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.226740 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.226750 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.226783 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.226794 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:47Z","lastTransitionTime":"2026-03-09T02:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.329511 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.329551 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.329560 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.329574 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.329583 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:47Z","lastTransitionTime":"2026-03-09T02:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.433290 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.433342 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.433357 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.433375 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.433386 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:47Z","lastTransitionTime":"2026-03-09T02:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.460762 4901 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.536160 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.536260 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.536279 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.536302 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.536319 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:47Z","lastTransitionTime":"2026-03-09T02:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.639416 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.639803 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.639931 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.640071 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.640186 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:47Z","lastTransitionTime":"2026-03-09T02:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.743407 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.743458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.743469 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.743488 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.743501 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:47Z","lastTransitionTime":"2026-03-09T02:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.847087 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.847180 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.847191 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.847211 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.847241 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:47Z","lastTransitionTime":"2026-03-09T02:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.949652 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.949739 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.949759 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.949788 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:47 crc kubenswrapper[4901]: I0309 02:42:47.949808 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:47Z","lastTransitionTime":"2026-03-09T02:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.054031 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.054637 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.054705 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.054783 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.054883 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:48Z","lastTransitionTime":"2026-03-09T02:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.106374 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:48 crc kubenswrapper[4901]: E0309 02:42:48.107062 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.106649 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.106514 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:48 crc kubenswrapper[4901]: E0309 02:42:48.107696 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:48 crc kubenswrapper[4901]: E0309 02:42:48.107798 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.158835 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.159242 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.159317 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.159389 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.159465 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:48Z","lastTransitionTime":"2026-03-09T02:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.262956 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.263002 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.263012 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.263029 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.263044 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:48Z","lastTransitionTime":"2026-03-09T02:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.366671 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.366745 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.366755 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.366773 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.366784 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:48Z","lastTransitionTime":"2026-03-09T02:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.469591 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.469625 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.469636 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.469651 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.469662 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:48Z","lastTransitionTime":"2026-03-09T02:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.571966 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.572423 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.572448 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.572478 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.572502 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:48Z","lastTransitionTime":"2026-03-09T02:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.675343 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.675398 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.675409 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.675427 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.675440 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:48Z","lastTransitionTime":"2026-03-09T02:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.778515 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.778591 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.778615 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.778644 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.778665 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:48Z","lastTransitionTime":"2026-03-09T02:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.882270 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.882333 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.882349 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.882377 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.882394 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:48Z","lastTransitionTime":"2026-03-09T02:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.985427 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.985481 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.985490 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.985508 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:48 crc kubenswrapper[4901]: I0309 02:42:48.985519 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:48Z","lastTransitionTime":"2026-03-09T02:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.088387 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.088468 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.088478 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.088498 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.088509 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:49Z","lastTransitionTime":"2026-03-09T02:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.191468 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.191538 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.191558 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.191582 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.191601 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:49Z","lastTransitionTime":"2026-03-09T02:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.295603 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.295659 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.295679 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.295705 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.295723 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:49Z","lastTransitionTime":"2026-03-09T02:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.398063 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.398121 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.398137 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.398159 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.398172 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:49Z","lastTransitionTime":"2026-03-09T02:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.501184 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.501241 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.501250 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.501269 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.501280 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:49Z","lastTransitionTime":"2026-03-09T02:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.590882 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.590936 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.594489 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.607803 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.607870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.607891 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.607920 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.607945 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:49Z","lastTransitionTime":"2026-03-09T02:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.608461 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680e
d1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.638605 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.656166 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.669421 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.684319 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.698616 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.710699 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.710749 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.710783 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.710792 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.710813 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.710824 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:49Z","lastTransitionTime":"2026-03-09T02:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.723504 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.738260 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.758404 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.787536 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.805026 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.814014 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.814065 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.814081 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.814105 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.814124 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:49Z","lastTransitionTime":"2026-03-09T02:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.820970 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.834528 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.853928 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.871801 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.875105 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.875193 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.875242 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.875270 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.875295 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875396 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:43:05.875338176 +0000 UTC m=+110.465001948 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875481 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875530 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875571 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875602 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-09 02:43:05.875586502 +0000 UTC m=+110.465250274 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875582 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875611 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875802 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875533 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875880 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875768 4901 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:43:05.875737995 +0000 UTC m=+110.465401747 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875941 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 02:43:05.875914799 +0000 UTC m=+110.465578542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:49 crc kubenswrapper[4901]: E0309 02:42:49.875960 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 02:43:05.87595161 +0000 UTC m=+110.465615362 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.887247 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.907517 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.916634 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.916674 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.916688 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.916713 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:49 crc kubenswrapper[4901]: I0309 02:42:49.916732 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:49Z","lastTransitionTime":"2026-03-09T02:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.019340 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.019395 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.019409 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.019431 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.019447 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:50Z","lastTransitionTime":"2026-03-09T02:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.105844 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.105921 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.105858 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:50 crc kubenswrapper[4901]: E0309 02:42:50.106064 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:50 crc kubenswrapper[4901]: E0309 02:42:50.106426 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:50 crc kubenswrapper[4901]: E0309 02:42:50.106500 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.122115 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.122212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.122300 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.122338 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.122363 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:50Z","lastTransitionTime":"2026-03-09T02:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.225007 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.225085 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.225102 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.225120 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.225133 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:50Z","lastTransitionTime":"2026-03-09T02:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.327289 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.327332 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.327343 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.327361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.327419 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:50Z","lastTransitionTime":"2026-03-09T02:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.430049 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.430113 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.430138 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.430169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.430193 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:50Z","lastTransitionTime":"2026-03-09T02:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.532196 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.532292 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.532302 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.532321 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.532332 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:50Z","lastTransitionTime":"2026-03-09T02:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.635705 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.635764 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.635780 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.635804 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.635822 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:50Z","lastTransitionTime":"2026-03-09T02:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.739215 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.739267 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.739296 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.739314 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.739329 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:50Z","lastTransitionTime":"2026-03-09T02:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.841841 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.841884 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.841894 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.841912 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.841923 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:50Z","lastTransitionTime":"2026-03-09T02:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.945173 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.945241 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.945255 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.945275 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:50 crc kubenswrapper[4901]: I0309 02:42:50.945287 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:50Z","lastTransitionTime":"2026-03-09T02:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.047823 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.047864 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.047875 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.047894 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.047905 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:51Z","lastTransitionTime":"2026-03-09T02:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.151894 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.151975 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.151993 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.152017 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.152032 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:51Z","lastTransitionTime":"2026-03-09T02:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.255177 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.255293 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.255312 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.255346 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.255365 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:51Z","lastTransitionTime":"2026-03-09T02:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.358587 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.358665 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.358683 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.358715 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.358733 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:51Z","lastTransitionTime":"2026-03-09T02:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.461870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.461898 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.461905 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.461918 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.461927 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:51Z","lastTransitionTime":"2026-03-09T02:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.564710 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.564776 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.564800 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.564839 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.564862 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:51Z","lastTransitionTime":"2026-03-09T02:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.667115 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.667158 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.667174 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.667197 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.667243 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:51Z","lastTransitionTime":"2026-03-09T02:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.770285 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.770332 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.770343 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.770359 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.770370 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:51Z","lastTransitionTime":"2026-03-09T02:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.872917 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.872958 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.872968 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.872995 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.873007 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:51Z","lastTransitionTime":"2026-03-09T02:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.976820 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.976865 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.976875 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.976891 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:51 crc kubenswrapper[4901]: I0309 02:42:51.976903 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:51Z","lastTransitionTime":"2026-03-09T02:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.080547 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.080601 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.080636 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.080676 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.080693 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:52Z","lastTransitionTime":"2026-03-09T02:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.105872 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.105884 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:52 crc kubenswrapper[4901]: E0309 02:42:52.106129 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.106294 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:52 crc kubenswrapper[4901]: E0309 02:42:52.106451 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:52 crc kubenswrapper[4901]: E0309 02:42:52.107269 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.107613 4901 scope.go:117] "RemoveContainer" containerID="d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc" Mar 09 02:42:52 crc kubenswrapper[4901]: E0309 02:42:52.107939 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.183577 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.183655 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.183671 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.183697 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.183712 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:52Z","lastTransitionTime":"2026-03-09T02:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.286727 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.286806 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.286830 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.286861 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.286878 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:52Z","lastTransitionTime":"2026-03-09T02:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.390412 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.390889 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.391052 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.391169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.391286 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:52Z","lastTransitionTime":"2026-03-09T02:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.494987 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.495062 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.495077 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.495108 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.495127 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:52Z","lastTransitionTime":"2026-03-09T02:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.598387 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.598436 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.598453 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.598478 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.598495 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:52Z","lastTransitionTime":"2026-03-09T02:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.604433 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4"} Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.622312 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"n
ame\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:52Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.645798 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:52Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.662886 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:52Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.678610 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:52Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.701444 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.701521 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.701547 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.701577 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.701600 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:52Z","lastTransitionTime":"2026-03-09T02:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.707912 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:52Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.725763 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:52Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.741261 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:42:52Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.761692 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:52Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.777806 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:52Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.803814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.803853 4901 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.803864 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.803880 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.803892 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:52Z","lastTransitionTime":"2026-03-09T02:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.906660 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.906700 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.906710 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.906725 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:52 crc kubenswrapper[4901]: I0309 02:42:52.906735 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:52Z","lastTransitionTime":"2026-03-09T02:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.010561 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.010615 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.010633 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.010662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.010682 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.114148 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.114607 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.114620 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.114640 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.114653 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.217423 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.217471 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.217480 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.217495 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.217504 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.319760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.319802 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.319811 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.319826 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.319836 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.422566 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.422637 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.422693 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.422723 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.422747 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.525212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.525263 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.525272 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.525288 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.525299 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.553348 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.553378 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.553386 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.553399 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.553407 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: E0309 02:42:53.571876 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:53Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.576361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.576383 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.576403 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.576416 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.576425 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.599083 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.599108 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.599118 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.599130 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.599139 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.624607 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.624673 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.624698 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.624729 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.624753 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: E0309 02:42:53.647101 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:53Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.652662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.652757 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.652777 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.652797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.652812 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: E0309 02:42:53.671346 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:53Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:53 crc kubenswrapper[4901]: E0309 02:42:53.671490 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.673433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.673468 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.673479 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.673512 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.673523 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.781960 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.782021 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.782037 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.782059 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.782081 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.884741 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.884797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.884814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.884837 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.884853 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.988288 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.988348 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.988359 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.988374 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:53 crc kubenswrapper[4901]: I0309 02:42:53.988382 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:53Z","lastTransitionTime":"2026-03-09T02:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.097582 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.097641 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.097658 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.097683 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.097700 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:54Z","lastTransitionTime":"2026-03-09T02:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.106366 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.106509 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:54 crc kubenswrapper[4901]: E0309 02:42:54.106574 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:54 crc kubenswrapper[4901]: E0309 02:42:54.106760 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.106956 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:54 crc kubenswrapper[4901]: E0309 02:42:54.107192 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.202265 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.202384 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.202407 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.202479 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.202503 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:54Z","lastTransitionTime":"2026-03-09T02:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.306506 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.306555 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.306571 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.306594 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.306613 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:54Z","lastTransitionTime":"2026-03-09T02:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.409883 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.409938 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.409955 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.409984 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.410002 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:54Z","lastTransitionTime":"2026-03-09T02:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.513329 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.513385 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.513403 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.513428 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.513446 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:54Z","lastTransitionTime":"2026-03-09T02:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.616507 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.616595 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.616615 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.616644 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.616664 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:54Z","lastTransitionTime":"2026-03-09T02:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.720781 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.720928 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.721005 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.721041 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.721134 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:54Z","lastTransitionTime":"2026-03-09T02:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.823931 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.823968 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.823979 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.823996 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.824008 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:54Z","lastTransitionTime":"2026-03-09T02:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.926726 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.926795 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.926818 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.926849 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:54 crc kubenswrapper[4901]: I0309 02:42:54.926871 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:54Z","lastTransitionTime":"2026-03-09T02:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.030075 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.030156 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.030178 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.030207 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.030273 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:55Z","lastTransitionTime":"2026-03-09T02:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.133187 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.133236 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.133245 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.133259 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.133268 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:55Z","lastTransitionTime":"2026-03-09T02:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.235919 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.235986 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.236005 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.236033 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.236051 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:55Z","lastTransitionTime":"2026-03-09T02:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.339370 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.339467 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.339500 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.339531 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.339557 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:55Z","lastTransitionTime":"2026-03-09T02:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.442683 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.442762 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.442785 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.442816 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.442833 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:55Z","lastTransitionTime":"2026-03-09T02:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.545779 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.545827 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.545841 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.545858 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.545873 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:55Z","lastTransitionTime":"2026-03-09T02:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.648939 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.649000 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.649019 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.649044 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.649068 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:55Z","lastTransitionTime":"2026-03-09T02:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.751708 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.751758 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.751768 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.751785 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.751799 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:55Z","lastTransitionTime":"2026-03-09T02:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.855213 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.855308 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.855325 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.855349 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.855368 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:55Z","lastTransitionTime":"2026-03-09T02:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.958733 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.958838 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.958857 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.958879 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:55 crc kubenswrapper[4901]: I0309 02:42:55.958895 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:55Z","lastTransitionTime":"2026-03-09T02:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.061756 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.061814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.061830 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.061853 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.061870 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:56Z","lastTransitionTime":"2026-03-09T02:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.105534 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.105591 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.105893 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:56 crc kubenswrapper[4901]: E0309 02:42:56.106055 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:56 crc kubenswrapper[4901]: E0309 02:42:56.106127 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:56 crc kubenswrapper[4901]: E0309 02:42:56.105837 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.122050 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:56Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.145717 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:56Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.164950 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.165018 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.165039 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.165062 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.165079 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:56Z","lastTransitionTime":"2026-03-09T02:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.171400 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680e
d1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:56Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.212779 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:56Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.239789 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:56Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.259903 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:56Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.269046 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.269117 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.269142 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.269175 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:56 crc 
kubenswrapper[4901]: I0309 02:42:56.269200 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:56Z","lastTransitionTime":"2026-03-09T02:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.284888 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:56Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.296726 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:56Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.312112 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:42:56Z is after 2025-08-24T17:21:41Z" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.372327 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.372403 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.372419 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.372440 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.372455 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:56Z","lastTransitionTime":"2026-03-09T02:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.476459 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.476514 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.476525 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.476547 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.476561 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:56Z","lastTransitionTime":"2026-03-09T02:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.580386 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.580443 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.580455 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.580476 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.580488 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:56Z","lastTransitionTime":"2026-03-09T02:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.684046 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.684123 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.684145 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.684177 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.684200 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:56Z","lastTransitionTime":"2026-03-09T02:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.787183 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.787257 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.787269 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.787291 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.787304 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:56Z","lastTransitionTime":"2026-03-09T02:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.890375 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.890434 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.890451 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.890474 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.890491 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:56Z","lastTransitionTime":"2026-03-09T02:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.994126 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.994186 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.994199 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.994254 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:56 crc kubenswrapper[4901]: I0309 02:42:56.994272 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:56Z","lastTransitionTime":"2026-03-09T02:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.097541 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.097607 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.097618 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.097643 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.097657 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:57Z","lastTransitionTime":"2026-03-09T02:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.201196 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.201315 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.201335 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.201405 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.201425 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:57Z","lastTransitionTime":"2026-03-09T02:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.305122 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.305176 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.305190 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.305210 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.305252 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:57Z","lastTransitionTime":"2026-03-09T02:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.408834 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.408894 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.408912 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.408935 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.408953 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:57Z","lastTransitionTime":"2026-03-09T02:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.513099 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.513161 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.513179 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.513208 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.513310 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:57Z","lastTransitionTime":"2026-03-09T02:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.616301 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.616380 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.616397 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.616416 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.616429 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:57Z","lastTransitionTime":"2026-03-09T02:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.719649 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.719755 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.719773 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.719800 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.719817 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:57Z","lastTransitionTime":"2026-03-09T02:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.823323 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.823391 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.823411 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.823439 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.823457 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:57Z","lastTransitionTime":"2026-03-09T02:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.927066 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.927118 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.927129 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.927148 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:57 crc kubenswrapper[4901]: I0309 02:42:57.927161 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:57Z","lastTransitionTime":"2026-03-09T02:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.030070 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.030132 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.030150 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.030174 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.030191 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:58Z","lastTransitionTime":"2026-03-09T02:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.105650 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.105726 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:42:58 crc kubenswrapper[4901]: E0309 02:42:58.105916 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:42:58 crc kubenswrapper[4901]: E0309 02:42:58.105968 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.106175 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:42:58 crc kubenswrapper[4901]: E0309 02:42:58.106605 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.139164 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.139288 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.139317 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.139354 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.139382 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:58Z","lastTransitionTime":"2026-03-09T02:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.243766 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.243852 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.243873 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.243906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.243928 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:58Z","lastTransitionTime":"2026-03-09T02:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.347816 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.347868 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.347885 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.347910 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.347928 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:58Z","lastTransitionTime":"2026-03-09T02:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.450720 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.450780 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.450793 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.450813 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.450827 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:58Z","lastTransitionTime":"2026-03-09T02:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.553568 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.553625 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.553637 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.553657 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.553676 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:58Z","lastTransitionTime":"2026-03-09T02:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.656890 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.657142 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.657154 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.657189 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.657201 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:58Z","lastTransitionTime":"2026-03-09T02:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.760057 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.760096 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.760105 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.760121 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.760132 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:58Z","lastTransitionTime":"2026-03-09T02:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.863361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.863419 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.863430 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.863448 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.863461 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:58Z","lastTransitionTime":"2026-03-09T02:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.965907 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.965940 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.965949 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.965963 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:58 crc kubenswrapper[4901]: I0309 02:42:58.965972 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:58Z","lastTransitionTime":"2026-03-09T02:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.068774 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.068805 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.068814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.068826 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.068835 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:59Z","lastTransitionTime":"2026-03-09T02:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.172666 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.172727 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.172746 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.172776 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.172794 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:59Z","lastTransitionTime":"2026-03-09T02:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.275838 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.275886 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.275903 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.275924 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.275939 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:59Z","lastTransitionTime":"2026-03-09T02:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.378089 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.378164 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.378187 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.378219 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.378273 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:59Z","lastTransitionTime":"2026-03-09T02:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.480890 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.480976 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.480999 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.481030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.481049 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:59Z","lastTransitionTime":"2026-03-09T02:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.583888 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.583950 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.583967 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.583993 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.584010 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:59Z","lastTransitionTime":"2026-03-09T02:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.686574 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.686627 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.686641 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.686659 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.686671 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:59Z","lastTransitionTime":"2026-03-09T02:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.789821 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.789877 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.789889 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.789908 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.789920 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:59Z","lastTransitionTime":"2026-03-09T02:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.893790 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.893890 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.893914 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.893946 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.893970 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:59Z","lastTransitionTime":"2026-03-09T02:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.997217 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.997322 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.997343 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.997369 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:42:59 crc kubenswrapper[4901]: I0309 02:42:59.997387 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:42:59Z","lastTransitionTime":"2026-03-09T02:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.104675 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.104722 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.104734 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.104752 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.104765 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:00Z","lastTransitionTime":"2026-03-09T02:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.109479 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.109495 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:00 crc kubenswrapper[4901]: E0309 02:43:00.109887 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.109545 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:00 crc kubenswrapper[4901]: E0309 02:43:00.110045 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:00 crc kubenswrapper[4901]: E0309 02:43:00.110287 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.116733 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-tvqrz"] Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.117120 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tvqrz" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.121438 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.121443 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.121835 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.139545 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.162488 4901 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c6
3b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.179537 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f7ac7cbd-671c-4beb-8994-92502ee47ceb-hosts-file\") pod \"node-resolver-tvqrz\" (UID: \"f7ac7cbd-671c-4beb-8994-92502ee47ceb\") " pod="openshift-dns/node-resolver-tvqrz" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.179586 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbmmp\" (UniqueName: 
\"kubernetes.io/projected/f7ac7cbd-671c-4beb-8994-92502ee47ceb-kube-api-access-pbmmp\") pod \"node-resolver-tvqrz\" (UID: \"f7ac7cbd-671c-4beb-8994-92502ee47ceb\") " pod="openshift-dns/node-resolver-tvqrz" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.181792 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.200058 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.207189 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.207475 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.207623 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.207757 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.207868 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:00Z","lastTransitionTime":"2026-03-09T02:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.226177 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.243115 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.258442 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.275870 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.280423 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbmmp\" (UniqueName: \"kubernetes.io/projected/f7ac7cbd-671c-4beb-8994-92502ee47ceb-kube-api-access-pbmmp\") pod \"node-resolver-tvqrz\" (UID: \"f7ac7cbd-671c-4beb-8994-92502ee47ceb\") " pod="openshift-dns/node-resolver-tvqrz" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.280526 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f7ac7cbd-671c-4beb-8994-92502ee47ceb-hosts-file\") pod \"node-resolver-tvqrz\" (UID: \"f7ac7cbd-671c-4beb-8994-92502ee47ceb\") " pod="openshift-dns/node-resolver-tvqrz" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.280626 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/f7ac7cbd-671c-4beb-8994-92502ee47ceb-hosts-file\") pod \"node-resolver-tvqrz\" (UID: \"f7ac7cbd-671c-4beb-8994-92502ee47ceb\") " pod="openshift-dns/node-resolver-tvqrz" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.292419 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.305035 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbmmp\" (UniqueName: \"kubernetes.io/projected/f7ac7cbd-671c-4beb-8994-92502ee47ceb-kube-api-access-pbmmp\") pod \"node-resolver-tvqrz\" (UID: \"f7ac7cbd-671c-4beb-8994-92502ee47ceb\") " pod="openshift-dns/node-resolver-tvqrz" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.309444 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.313645 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.313705 4901 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.313720 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.313742 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.313757 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:00Z","lastTransitionTime":"2026-03-09T02:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.416250 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.416306 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.416325 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.416349 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.416366 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:00Z","lastTransitionTime":"2026-03-09T02:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.439366 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tvqrz" Mar 09 02:43:00 crc kubenswrapper[4901]: W0309 02:43:00.456745 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7ac7cbd_671c_4beb_8994_92502ee47ceb.slice/crio-8157b03133276ef8f132c80e01f7822ae7bfbcfd7fd7c9b6b5c87a03a9910679 WatchSource:0}: Error finding container 8157b03133276ef8f132c80e01f7822ae7bfbcfd7fd7c9b6b5c87a03a9910679: Status 404 returned error can't find the container with id 8157b03133276ef8f132c80e01f7822ae7bfbcfd7fd7c9b6b5c87a03a9910679 Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.507077 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jtcxx"] Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.508144 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-429fk"] Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.508372 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.508464 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-5c998"] Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.509108 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.509116 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.511265 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.512521 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.514069 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.514187 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.514466 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.514738 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.514919 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.515331 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.515640 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.515782 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.516056 4901 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.516831 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.520391 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.520433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.520446 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.520467 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.520511 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:00Z","lastTransitionTime":"2026-03-09T02:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.534608 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.554441 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.573942 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588355 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-daemon-config\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588412 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65e722e8-52c4-4bb6-9927-f378b2f7296a-mcd-auth-proxy-config\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588443 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-var-lib-cni-multus\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588469 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-system-cni-dir\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588492 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-var-lib-kubelet\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588516 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-os-release\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588546 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm787\" (UniqueName: \"kubernetes.io/projected/98293641-7bf9-4473-ae92-c80e56cefdb5-kube-api-access-tm787\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " 
pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588600 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-run-k8s-cni-cncf-io\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588644 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-socket-dir-parent\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588744 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-cnibin\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588802 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-run-netns\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588844 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-cni-binary-copy\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" 
Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588883 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-hostroot\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.588922 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j6pq\" (UniqueName: \"kubernetes.io/projected/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-kube-api-access-4j6pq\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589007 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-system-cni-dir\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589067 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-etc-kubernetes\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589106 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-cnibin\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 
02:43:00.589139 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98293641-7bf9-4473-ae92-c80e56cefdb5-cni-binary-copy\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589182 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589217 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98293641-7bf9-4473-ae92-c80e56cefdb5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589279 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-cni-dir\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589313 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-var-lib-cni-bin\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " 
pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589344 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-run-multus-certs\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589405 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr22q\" (UniqueName: \"kubernetes.io/projected/65e722e8-52c4-4bb6-9927-f378b2f7296a-kube-api-access-pr22q\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589438 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65e722e8-52c4-4bb6-9927-f378b2f7296a-rootfs\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589470 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65e722e8-52c4-4bb6-9927-f378b2f7296a-proxy-tls\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589502 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-conf-dir\") pod 
\"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.589612 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-os-release\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.594697 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.619813 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.631025 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.631085 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.631103 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.631129 4901 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.631146 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:00Z","lastTransitionTime":"2026-03-09T02:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.635490 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.636420 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tvqrz" event={"ID":"f7ac7cbd-671c-4beb-8994-92502ee47ceb","Type":"ContainerStarted","Data":"8157b03133276ef8f132c80e01f7822ae7bfbcfd7fd7c9b6b5c87a03a9910679"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.651649 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.674699 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690295 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-system-cni-dir\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690379 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-var-lib-kubelet\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690416 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-os-release\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690482 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm787\" (UniqueName: \"kubernetes.io/projected/98293641-7bf9-4473-ae92-c80e56cefdb5-kube-api-access-tm787\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690481 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-system-cni-dir\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690517 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-run-k8s-cni-cncf-io\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690550 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-socket-dir-parent\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690596 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-cnibin\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690625 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-run-netns\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690679 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-cni-binary-copy\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690708 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-hostroot\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690743 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j6pq\" (UniqueName: \"kubernetes.io/projected/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-kube-api-access-4j6pq\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690784 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-system-cni-dir\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " 
pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690829 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-etc-kubernetes\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690862 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-cnibin\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690895 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98293641-7bf9-4473-ae92-c80e56cefdb5-cni-binary-copy\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690931 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690963 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr22q\" (UniqueName: \"kubernetes.io/projected/65e722e8-52c4-4bb6-9927-f378b2f7296a-kube-api-access-pr22q\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " 
pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.690994 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98293641-7bf9-4473-ae92-c80e56cefdb5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691027 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-cni-dir\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691056 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-var-lib-cni-bin\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691086 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-run-multus-certs\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691117 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65e722e8-52c4-4bb6-9927-f378b2f7296a-rootfs\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " 
pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691169 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65e722e8-52c4-4bb6-9927-f378b2f7296a-proxy-tls\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691203 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-conf-dir\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691291 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-os-release\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691333 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-daemon-config\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691388 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65e722e8-52c4-4bb6-9927-f378b2f7296a-mcd-auth-proxy-config\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 
02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691429 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-var-lib-cni-multus\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691518 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-var-lib-cni-multus\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691577 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-var-lib-kubelet\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.691655 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-os-release\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.692056 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-run-k8s-cni-cncf-io\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.692155 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-socket-dir-parent\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.692425 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65e722e8-52c4-4bb6-9927-f378b2f7296a-rootfs\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.692578 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-os-release\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.692548 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.692992 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-conf-dir\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " 
pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.693087 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-var-lib-cni-bin\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.693202 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-cni-dir\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.693250 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-cnibin\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.693314 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98293641-7bf9-4473-ae92-c80e56cefdb5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.693315 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-run-multus-certs\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.693459 4901 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-system-cni-dir\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.693469 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-etc-kubernetes\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.693513 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-host-run-netns\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.693511 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-cnibin\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.693762 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-hostroot\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.694075 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65e722e8-52c4-4bb6-9927-f378b2f7296a-mcd-auth-proxy-config\") pod \"machine-config-daemon-5c998\" (UID: 
\"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.694093 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98293641-7bf9-4473-ae92-c80e56cefdb5-cni-binary-copy\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.694742 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-cni-binary-copy\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.694783 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-multus-daemon-config\") pod \"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.695103 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98293641-7bf9-4473-ae92-c80e56cefdb5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.708799 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65e722e8-52c4-4bb6-9927-f378b2f7296a-proxy-tls\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " 
pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.713412 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm787\" (UniqueName: \"kubernetes.io/projected/98293641-7bf9-4473-ae92-c80e56cefdb5-kube-api-access-tm787\") pod \"multus-additional-cni-plugins-jtcxx\" (UID: \"98293641-7bf9-4473-ae92-c80e56cefdb5\") " pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.714024 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.719688 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j6pq\" (UniqueName: \"kubernetes.io/projected/a0d0e040-7ca3-4af8-9f02-d96cff6b3edf-kube-api-access-4j6pq\") pod 
\"multus-429fk\" (UID: \"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\") " pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.722913 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr22q\" (UniqueName: \"kubernetes.io/projected/65e722e8-52c4-4bb6-9927-f378b2f7296a-kube-api-access-pr22q\") pod \"machine-config-daemon-5c998\" (UID: \"65e722e8-52c4-4bb6-9927-f378b2f7296a\") " pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.734823 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.734874 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.734889 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.734909 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.734922 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:00Z","lastTransitionTime":"2026-03-09T02:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.736289 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.773067 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.791179 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.805455 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.826058 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.835653 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.837319 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.837355 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.837367 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.837386 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.837399 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:00Z","lastTransitionTime":"2026-03-09T02:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.844268 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.848951 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-429fk" Mar 09 02:43:00 crc kubenswrapper[4901]: W0309 02:43:00.853549 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98293641_7bf9_4473_ae92_c80e56cefdb5.slice/crio-98d629bef0fa3471027467f789ac8c54f41653f838cb828b53587be9f2d09eb5 WatchSource:0}: Error finding container 98d629bef0fa3471027467f789ac8c54f41653f838cb828b53587be9f2d09eb5: Status 404 returned error can't find the container with id 98d629bef0fa3471027467f789ac8c54f41653f838cb828b53587be9f2d09eb5 Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.861674 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.861989 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:43:00 crc kubenswrapper[4901]: W0309 02:43:00.865475 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0d0e040_7ca3_4af8_9f02_d96cff6b3edf.slice/crio-21433b8c02925f8fd1c8af2ba36440fbf1515ee9b9bbe5f9c93e90b47103c74f WatchSource:0}: Error finding container 21433b8c02925f8fd1c8af2ba36440fbf1515ee9b9bbe5f9c93e90b47103c74f: Status 404 returned error can't find the container with id 21433b8c02925f8fd1c8af2ba36440fbf1515ee9b9bbe5f9c93e90b47103c74f Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.878272 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.881498 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bmfgc"] Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.882538 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.885354 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.885680 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.885734 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.886116 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.886118 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.886152 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.888965 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.899477 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.920977 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.937944 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.940313 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.940350 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.940362 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.940380 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.940390 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:00Z","lastTransitionTime":"2026-03-09T02:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.960018 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.983926 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.994511 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-systemd-units\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.994579 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-openvswitch\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.994622 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-script-lib\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: 
I0309 02:43:00.994783 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-slash\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.994848 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-ovn\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.994866 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-env-overrides\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.994895 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-kubelet\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.994918 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-var-lib-openvswitch\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: 
I0309 02:43:00.994948 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-log-socket\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.994968 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-netd\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.994987 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-ovn-kubernetes\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.995002 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovn-node-metrics-cert\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.995041 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-netns\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 
02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.995069 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cddsm\" (UniqueName: \"kubernetes.io/projected/40c17e04-3fc2-48a2-95dc-fe0428b91e66-kube-api-access-cddsm\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.995096 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.995115 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-config\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.995135 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-bin\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.995167 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-systemd\") pod \"ovnkube-node-bmfgc\" (UID: 
\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.995183 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-etc-openvswitch\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:00 crc kubenswrapper[4901]: I0309 02:43:00.995198 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-node-log\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.001673 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:00Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.023585 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.037388 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.041991 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.042030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.042040 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.042608 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.042654 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:01Z","lastTransitionTime":"2026-03-09T02:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.053745 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 
\\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.066185 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.079472 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096360 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cddsm\" (UniqueName: \"kubernetes.io/projected/40c17e04-3fc2-48a2-95dc-fe0428b91e66-kube-api-access-cddsm\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096775 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096799 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-config\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096820 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-systemd\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096836 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-etc-openvswitch\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096861 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-node-log\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096877 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-bin\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096873 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096912 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-systemd-units\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096943 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-openvswitch\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096958 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-etc-openvswitch\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.096965 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-script-lib\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097027 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-slash\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097070 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-ovn\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097098 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-env-overrides\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097145 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-kubelet\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097173 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-var-lib-openvswitch\") pod 
\"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097217 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-log-socket\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097279 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-netd\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097309 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-ovn-kubernetes\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097343 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovn-node-metrics-cert\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097377 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-netns\") pod \"ovnkube-node-bmfgc\" (UID: 
\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097485 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-netns\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097742 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-script-lib\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097806 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-node-log\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097831 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-bin\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097852 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-systemd-units\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc 
kubenswrapper[4901]: I0309 02:43:01.097880 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-openvswitch\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097906 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-kubelet\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097928 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-systemd\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097953 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-slash\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.097977 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-ovn\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.098323 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-netd\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.098347 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-config\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.098394 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-env-overrides\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.098406 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-log-socket\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.098334 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-var-lib-openvswitch\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.098465 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.103795 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.104897 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovn-node-metrics-cert\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.115561 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.118037 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cddsm\" (UniqueName: \"kubernetes.io/projected/40c17e04-3fc2-48a2-95dc-fe0428b91e66-kube-api-access-cddsm\") pod \"ovnkube-node-bmfgc\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.129252 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.157324 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.157662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.157681 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.157694 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.157711 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.157724 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:01Z","lastTransitionTime":"2026-03-09T02:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.171215 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.191929 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7add
dfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.207636 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.208142 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.224176 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: W0309 02:43:01.225478 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40c17e04_3fc2_48a2_95dc_fe0428b91e66.slice/crio-9518415db7d7377fe8b5c75deee1d0719fdd022fe2f41787df42c8537002e69c WatchSource:0}: Error finding container 9518415db7d7377fe8b5c75deee1d0719fdd022fe2f41787df42c8537002e69c: Status 404 
returned error can't find the container with id 9518415db7d7377fe8b5c75deee1d0719fdd022fe2f41787df42c8537002e69c Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.244183 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.260561 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.260626 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.260639 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.260675 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.260689 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:01Z","lastTransitionTime":"2026-03-09T02:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.367529 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.367590 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.367602 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.367620 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.367631 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:01Z","lastTransitionTime":"2026-03-09T02:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.470898 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.470945 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.470959 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.470978 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.470990 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:01Z","lastTransitionTime":"2026-03-09T02:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.574675 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.574738 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.574751 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.574778 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.574794 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:01Z","lastTransitionTime":"2026-03-09T02:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.646050 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.646113 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.646131 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"395db4a394b2703aa229ece5d3201fea4cc9360c09fb2ae76073f3ad684c6ddf"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.647796 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-429fk" event={"ID":"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf","Type":"ContainerStarted","Data":"0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.647868 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-429fk" event={"ID":"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf","Type":"ContainerStarted","Data":"21433b8c02925f8fd1c8af2ba36440fbf1515ee9b9bbe5f9c93e90b47103c74f"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.650162 4901 generic.go:334] "Generic (PLEG): container finished" podID="98293641-7bf9-4473-ae92-c80e56cefdb5" containerID="9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02" exitCode=0 Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.650271 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" event={"ID":"98293641-7bf9-4473-ae92-c80e56cefdb5","Type":"ContainerDied","Data":"9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.650521 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" event={"ID":"98293641-7bf9-4473-ae92-c80e56cefdb5","Type":"ContainerStarted","Data":"98d629bef0fa3471027467f789ac8c54f41653f838cb828b53587be9f2d09eb5"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.652166 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tvqrz" event={"ID":"f7ac7cbd-671c-4beb-8994-92502ee47ceb","Type":"ContainerStarted","Data":"ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.654186 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf" exitCode=0 Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.654283 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.654433 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"9518415db7d7377fe8b5c75deee1d0719fdd022fe2f41787df42c8537002e69c"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.675583 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.678409 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.678456 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.678467 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.678490 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.678501 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:01Z","lastTransitionTime":"2026-03-09T02:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.708625 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.726787 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.757261 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.777264 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.783893 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.783947 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.783959 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.783980 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.783994 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:01Z","lastTransitionTime":"2026-03-09T02:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.793366 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.810008 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.824574 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.841999 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.858703 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.879897 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.886638 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.886672 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.886680 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.886694 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.886704 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:01Z","lastTransitionTime":"2026-03-09T02:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.899137 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.915927 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.930206 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a
6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T0
2:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.951722 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9
ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.967593 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.982084 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.989261 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.989308 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.989319 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.989338 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:01 crc kubenswrapper[4901]: I0309 02:43:01.989350 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:01Z","lastTransitionTime":"2026-03-09T02:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.000033 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:01Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.013517 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.028941 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.056368 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.068388 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.086882 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.091681 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.091743 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.091756 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.091778 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.091794 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:02Z","lastTransitionTime":"2026-03-09T02:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.101451 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.106105 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:02 crc kubenswrapper[4901]: E0309 02:43:02.106260 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.106337 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:02 crc kubenswrapper[4901]: E0309 02:43:02.106455 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.107364 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:02 crc kubenswrapper[4901]: E0309 02:43:02.107448 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.117059 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.140052 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.156943 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.175711 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.196365 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.196804 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.196818 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.196839 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.196853 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:02Z","lastTransitionTime":"2026-03-09T02:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.300039 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.300095 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.300114 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.300137 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.300152 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:02Z","lastTransitionTime":"2026-03-09T02:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.403501 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.403543 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.403555 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.403572 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.403587 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:02Z","lastTransitionTime":"2026-03-09T02:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.506054 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.506102 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.506114 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.506134 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.506146 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:02Z","lastTransitionTime":"2026-03-09T02:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.609101 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.609472 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.609490 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.609513 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.609529 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:02Z","lastTransitionTime":"2026-03-09T02:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.662604 4901 generic.go:334] "Generic (PLEG): container finished" podID="98293641-7bf9-4473-ae92-c80e56cefdb5" containerID="57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a" exitCode=0 Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.662712 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" event={"ID":"98293641-7bf9-4473-ae92-c80e56cefdb5","Type":"ContainerDied","Data":"57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.674342 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.674434 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.674458 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.674482 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.701703 4901 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be
4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb0
2a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.713041 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.713091 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.713107 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.713128 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.713143 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:02Z","lastTransitionTime":"2026-03-09T02:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.732757 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.755559 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.789256 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.808477 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.818098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.818150 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.818166 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:02 crc 
kubenswrapper[4901]: I0309 02:43:02.818190 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.818204 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:02Z","lastTransitionTime":"2026-03-09T02:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.828726 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.847452 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.865705 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.885145 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.905676 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.922461 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.922507 4901 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.922522 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.922540 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.922552 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:02Z","lastTransitionTime":"2026-03-09T02:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.925335 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.947721 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.963541 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:02 crc kubenswrapper[4901]: I0309 02:43:02.976713 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:02Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.026124 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.026165 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.026177 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.026193 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.026202 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.129113 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.129737 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.129763 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.129796 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.129824 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.233369 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.233434 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.233452 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.233479 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.233496 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.336571 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.336618 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.336629 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.336646 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.336660 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.439662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.439716 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.439730 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.439749 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.439762 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.542700 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.542755 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.542768 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.542786 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.542799 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.646202 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.646303 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.646321 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.646345 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.646363 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.682443 4901 generic.go:334] "Generic (PLEG): container finished" podID="98293641-7bf9-4473-ae92-c80e56cefdb5" containerID="1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6" exitCode=0 Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.682548 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" event={"ID":"98293641-7bf9-4473-ae92-c80e56cefdb5","Type":"ContainerDied","Data":"1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.689384 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.689443 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.706903 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340
6de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.728734 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.744998 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.749460 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.749520 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.749544 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.749576 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.749599 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.760901 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.781941 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa
615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.794761 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.794813 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.794828 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.794853 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.794867 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.805095 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: E0309 02:43:03.813613 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.818394 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.818430 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.818443 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.818464 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.818479 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.827572 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z 
is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: E0309 02:43:03.831769 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.836488 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.836567 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.836595 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.836626 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.836652 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.844678 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: E0309 02:43:03.852543 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.856740 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.856794 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.856812 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.856836 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.856854 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.862934 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: E0309 02:43:03.872599 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.876942 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.876986 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.876997 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.877016 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.877030 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.884610 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: E0309 02:43:03.899425 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: E0309 02:43:03.899680 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.902823 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.902859 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.902870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.902889 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.902901 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:03Z","lastTransitionTime":"2026-03-09T02:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.906789 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.926174 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.955881 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:03 crc kubenswrapper[4901]: I0309 02:43:03.989346 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:03Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.005128 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.005165 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.005174 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.005188 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.005197 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:04Z","lastTransitionTime":"2026-03-09T02:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.106303 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.106375 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.106436 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:04 crc kubenswrapper[4901]: E0309 02:43:04.106557 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:04 crc kubenswrapper[4901]: E0309 02:43:04.106709 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:04 crc kubenswrapper[4901]: E0309 02:43:04.106893 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.109323 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.109356 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.109366 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.109397 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.109410 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:04Z","lastTransitionTime":"2026-03-09T02:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.212373 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.212433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.212451 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.212479 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.212498 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:04Z","lastTransitionTime":"2026-03-09T02:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.314855 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.314897 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.314908 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.314925 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.314937 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:04Z","lastTransitionTime":"2026-03-09T02:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.417700 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.417759 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.417778 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.417802 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.417821 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:04Z","lastTransitionTime":"2026-03-09T02:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.521109 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.521161 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.521177 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.521199 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.521216 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:04Z","lastTransitionTime":"2026-03-09T02:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.624323 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.624579 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.624761 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.624906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.625039 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:04Z","lastTransitionTime":"2026-03-09T02:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.698841 4901 generic.go:334] "Generic (PLEG): container finished" podID="98293641-7bf9-4473-ae92-c80e56cefdb5" containerID="a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84" exitCode=0 Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.698894 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" event={"ID":"98293641-7bf9-4473-ae92-c80e56cefdb5","Type":"ContainerDied","Data":"a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.715916 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.728658 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.728727 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.728739 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.728762 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.728777 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:04Z","lastTransitionTime":"2026-03-09T02:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.731309 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.749896 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa
615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.766862 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.795137 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.806702 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7934
26f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.817821 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.828900 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.830934 4901 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.830956 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.830964 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.830978 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.830986 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:04Z","lastTransitionTime":"2026-03-09T02:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.843339 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.856532 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.879991 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.904743 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.921357 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.933688 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.933717 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.933729 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.933743 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:04 crc 
kubenswrapper[4901]: I0309 02:43:04.933752 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:04Z","lastTransitionTime":"2026-03-09T02:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:04 crc kubenswrapper[4901]: I0309 02:43:04.935844 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:04Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.043522 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.043586 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.043606 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.043632 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.043650 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:05Z","lastTransitionTime":"2026-03-09T02:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.146069 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.146112 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.146126 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.146144 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.146158 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:05Z","lastTransitionTime":"2026-03-09T02:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.249321 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.249384 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.249401 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.249430 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.249443 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:05Z","lastTransitionTime":"2026-03-09T02:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.352788 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.352833 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.352842 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.352858 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.352870 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:05Z","lastTransitionTime":"2026-03-09T02:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.455926 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.455982 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.456001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.456027 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.456044 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:05Z","lastTransitionTime":"2026-03-09T02:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.558735 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.558811 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.558828 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.558853 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.558871 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:05Z","lastTransitionTime":"2026-03-09T02:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.661589 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.661644 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.661662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.661685 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.661702 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:05Z","lastTransitionTime":"2026-03-09T02:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.708963 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.713994 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" event={"ID":"98293641-7bf9-4473-ae92-c80e56cefdb5","Type":"ContainerStarted","Data":"ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.738201 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.758527 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.764488 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.764560 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.764583 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.764612 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.764636 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:05Z","lastTransitionTime":"2026-03-09T02:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.780611 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 
\\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.801513 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.818618 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.843114 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa
615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.866668 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.867707 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.867762 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.867780 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.867807 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.867824 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:05Z","lastTransitionTime":"2026-03-09T02:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.882809 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.901003 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.918787 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.949389 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.951628 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.951779 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.951824 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.951849 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.951870 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.951952 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:43:37.951910452 +0000 UTC m=+142.541574214 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952024 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952069 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952098 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952113 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952181 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:43:37.952143238 +0000 UTC m=+142.541807120 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952219 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952324 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952351 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952263 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 02:43:37.952207659 +0000 UTC m=+142.541871611 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952448 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 02:43:37.952419354 +0000 UTC m=+142.542083116 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952583 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:43:05 crc kubenswrapper[4901]: E0309 02:43:05.952643 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:43:37.952628409 +0000 UTC m=+142.542292181 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.968404 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.970750 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.970796 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.970814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.970840 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.970858 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:05Z","lastTransitionTime":"2026-03-09T02:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:05 crc kubenswrapper[4901]: I0309 02:43:05.983936 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:05Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.010764 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.074091 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.074169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.074187 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.074240 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.074265 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:06Z","lastTransitionTime":"2026-03-09T02:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.105829 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:06 crc kubenswrapper[4901]: E0309 02:43:06.106195 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.105907 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.106007 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:06 crc kubenswrapper[4901]: E0309 02:43:06.106403 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:06 crc kubenswrapper[4901]: E0309 02:43:06.106536 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.128423 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.154076 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\"
:\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.177554 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.196290 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.230045 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.244959 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.447272 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.447325 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.447336 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 
02:43:06.447356 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.447367 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:06Z","lastTransitionTime":"2026-03-09T02:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.458104 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.490485 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.509445 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.532834 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.548483 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.553005 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.553073 4901 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.553092 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.553574 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.553773 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:06Z","lastTransitionTime":"2026-03-09T02:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.565340 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.579574 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.599563 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa
615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.657197 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.657274 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.657290 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.657315 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.657329 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:06Z","lastTransitionTime":"2026-03-09T02:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.722215 4901 generic.go:334] "Generic (PLEG): container finished" podID="98293641-7bf9-4473-ae92-c80e56cefdb5" containerID="ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740" exitCode=0 Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.722319 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" event={"ID":"98293641-7bf9-4473-ae92-c80e56cefdb5","Type":"ContainerDied","Data":"ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740"} Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.741403 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.755170 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.759545 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.759598 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.759616 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.759642 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.759659 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:06Z","lastTransitionTime":"2026-03-09T02:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.780559 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:
43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.797698 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.849333 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.863172 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.863241 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.863254 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.863277 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.863290 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:06Z","lastTransitionTime":"2026-03-09T02:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.894857 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.918981 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.937054 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.951948 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.965919 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.968594 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.968640 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.968652 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.968671 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.968686 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:06Z","lastTransitionTime":"2026-03-09T02:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.983362 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 
\\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:06 crc kubenswrapper[4901]: I0309 02:43:06.999611 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:06Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.013913 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.036350 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa
615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.070971 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.071017 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.071030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.071048 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.071060 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:07Z","lastTransitionTime":"2026-03-09T02:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.106254 4901 scope.go:117] "RemoveContainer" containerID="d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc" Mar 09 02:43:07 crc kubenswrapper[4901]: E0309 02:43:07.106446 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.174691 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.174760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.174783 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.174813 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.174830 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:07Z","lastTransitionTime":"2026-03-09T02:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.271368 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-c2pjf"] Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.271924 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.275001 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.275363 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.275602 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.275915 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.281973 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.282219 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.282475 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 
02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.282620 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.282741 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:07Z","lastTransitionTime":"2026-03-09T02:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.298685 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.319486 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.337582 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.351996 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.366679 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.385510 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.385579 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.385602 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.385632 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.385658 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:07Z","lastTransitionTime":"2026-03-09T02:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.386074 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.409756 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.436326 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.454834 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b166f443-7b7c-478b-bfc1-67291aa158fd-host\") pod \"node-ca-c2pjf\" (UID: \"b166f443-7b7c-478b-bfc1-67291aa158fd\") " pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.454931 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4l68\" (UniqueName: 
\"kubernetes.io/projected/b166f443-7b7c-478b-bfc1-67291aa158fd-kube-api-access-x4l68\") pod \"node-ca-c2pjf\" (UID: \"b166f443-7b7c-478b-bfc1-67291aa158fd\") " pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.454960 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b166f443-7b7c-478b-bfc1-67291aa158fd-serviceca\") pod \"node-ca-c2pjf\" (UID: \"b166f443-7b7c-478b-bfc1-67291aa158fd\") " pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.457681 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.474368 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08
aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.488738 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.488793 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.488811 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.488837 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.488855 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:07Z","lastTransitionTime":"2026-03-09T02:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.502812 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.520424 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.539200 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.556141 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4l68\" (UniqueName: \"kubernetes.io/projected/b166f443-7b7c-478b-bfc1-67291aa158fd-kube-api-access-x4l68\") pod \"node-ca-c2pjf\" (UID: \"b166f443-7b7c-478b-bfc1-67291aa158fd\") " pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.556247 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b166f443-7b7c-478b-bfc1-67291aa158fd-serviceca\") pod \"node-ca-c2pjf\" (UID: \"b166f443-7b7c-478b-bfc1-67291aa158fd\") " pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.556290 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b166f443-7b7c-478b-bfc1-67291aa158fd-host\") pod \"node-ca-c2pjf\" (UID: \"b166f443-7b7c-478b-bfc1-67291aa158fd\") " pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.556386 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b166f443-7b7c-478b-bfc1-67291aa158fd-host\") pod \"node-ca-c2pjf\" (UID: \"b166f443-7b7c-478b-bfc1-67291aa158fd\") " pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.558140 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b166f443-7b7c-478b-bfc1-67291aa158fd-serviceca\") pod \"node-ca-c2pjf\" (UID: \"b166f443-7b7c-478b-bfc1-67291aa158fd\") " pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.565768 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.578345 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.583527 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4l68\" (UniqueName: \"kubernetes.io/projected/b166f443-7b7c-478b-bfc1-67291aa158fd-kube-api-access-x4l68\") pod \"node-ca-c2pjf\" (UID: \"b166f443-7b7c-478b-bfc1-67291aa158fd\") " pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.592363 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.592408 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.592426 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.592453 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.592474 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:07Z","lastTransitionTime":"2026-03-09T02:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.593279 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-c2pjf" Mar 09 02:43:07 crc kubenswrapper[4901]: W0309 02:43:07.609826 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb166f443_7b7c_478b_bfc1_67291aa158fd.slice/crio-e2c055296c7e344720008422e5370ffacc21a0e9e041e9bd4f5307f586e55e59 WatchSource:0}: Error finding container e2c055296c7e344720008422e5370ffacc21a0e9e041e9bd4f5307f586e55e59: Status 404 returned error can't find the container with id e2c055296c7e344720008422e5370ffacc21a0e9e041e9bd4f5307f586e55e59 Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.695605 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.695636 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.695647 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.695663 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.695674 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:07Z","lastTransitionTime":"2026-03-09T02:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.735901 4901 generic.go:334] "Generic (PLEG): container finished" podID="98293641-7bf9-4473-ae92-c80e56cefdb5" containerID="99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940" exitCode=0 Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.735943 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" event={"ID":"98293641-7bf9-4473-ae92-c80e56cefdb5","Type":"ContainerDied","Data":"99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.744566 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.745161 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.745251 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.745383 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.752899 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-c2pjf" event={"ID":"b166f443-7b7c-478b-bfc1-67291aa158fd","Type":"ContainerStarted","Data":"e2c055296c7e344720008422e5370ffacc21a0e9e041e9bd4f5307f586e55e59"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.769555 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.783490 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.798076 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.798120 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.798132 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 
02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.798149 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.798162 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:07Z","lastTransitionTime":"2026-03-09T02:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.805402 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.816763 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.820462 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.825798 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.840399 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.855073 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.870408 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.884902 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa
615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.898936 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.900599 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.900671 4901 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.900690 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.900720 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.900741 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:07Z","lastTransitionTime":"2026-03-09T02:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.910301 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.921984 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.941939 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.967276 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:07 crc kubenswrapper[4901]: I0309 02:43:07.987525 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:07Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.002327 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.004286 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.004342 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.004358 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.004381 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.004401 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:08Z","lastTransitionTime":"2026-03-09T02:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.020036 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-ope
rator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.034216 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.048020 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.064499 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.092125 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.102791 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.105151 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.105263 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:08 crc kubenswrapper[4901]: E0309 02:43:08.105292 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:08 crc kubenswrapper[4901]: E0309 02:43:08.105453 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.105553 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:08 crc kubenswrapper[4901]: E0309 02:43:08.105657 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.106606 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.106630 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.106639 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.106651 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.106659 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:08Z","lastTransitionTime":"2026-03-09T02:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.131698 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.145324 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.162277 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.181303 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340
6de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.197708 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.209819 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.209866 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.209883 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:08 crc 
kubenswrapper[4901]: I0309 02:43:08.209906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.209923 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:08Z","lastTransitionTime":"2026-03-09T02:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.214253 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.230749 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.245884 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.268895 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.311862 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.311895 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.311904 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.311918 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.311927 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:08Z","lastTransitionTime":"2026-03-09T02:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.414199 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.414294 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.414313 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.414337 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.414361 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:08Z","lastTransitionTime":"2026-03-09T02:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.516887 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.516956 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.516974 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.516999 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.517016 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:08Z","lastTransitionTime":"2026-03-09T02:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.620092 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.620168 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.620187 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.620216 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.620262 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:08Z","lastTransitionTime":"2026-03-09T02:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.723349 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.723410 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.723422 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.723445 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.723462 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:08Z","lastTransitionTime":"2026-03-09T02:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.758843 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-c2pjf" event={"ID":"b166f443-7b7c-478b-bfc1-67291aa158fd","Type":"ContainerStarted","Data":"f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.765608 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" event={"ID":"98293641-7bf9-4473-ae92-c80e56cefdb5","Type":"ContainerStarted","Data":"d8cb9651e7ee62e24bc299a24e24546fc393d36c247cfc7d6fab578f2d5fc392"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.784179 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.804593 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.821177 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.836960 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.837023 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.837037 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.837055 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.837072 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:08Z","lastTransitionTime":"2026-03-09T02:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.838988 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.866607 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.883999 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.905001 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.925392 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.940745 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.940814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.940835 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.940860 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.940878 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:08Z","lastTransitionTime":"2026-03-09T02:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.961748 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:08 crc kubenswrapper[4901]: I0309 02:43:08.981561 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:08Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.020627 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.043818 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.043899 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.043920 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.043942 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.043780 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.043973 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:09Z","lastTransitionTime":"2026-03-09T02:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.061462 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.076557 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340
6de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.090331 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.105510 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 
02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.117771 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.130256 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.144599 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.145959 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.146027 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.146050 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.146084 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.146110 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:09Z","lastTransitionTime":"2026-03-09T02:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.158724 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.174278 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.194507 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.216791 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.232847 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.248322 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:09 crc 
kubenswrapper[4901]: I0309 02:43:09.248362 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.248371 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.248385 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.248395 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:09Z","lastTransitionTime":"2026-03-09T02:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.249626 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a
36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.281780 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.295994 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.312660 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.332497 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.345461 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:09Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.351383 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.351433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.351451 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.351475 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 
02:43:09.351510 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:09Z","lastTransitionTime":"2026-03-09T02:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.454641 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.454693 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.454712 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.454738 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.454755 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:09Z","lastTransitionTime":"2026-03-09T02:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.557698 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.557765 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.557790 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.557821 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.557845 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:09Z","lastTransitionTime":"2026-03-09T02:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.661313 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.661359 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.661377 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.661408 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.661433 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:09Z","lastTransitionTime":"2026-03-09T02:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.764891 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.764997 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.765022 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.765047 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.765064 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:09Z","lastTransitionTime":"2026-03-09T02:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.867725 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.867780 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.867795 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.867870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.867890 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:09Z","lastTransitionTime":"2026-03-09T02:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.971759 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.971814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.971832 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.971856 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:09 crc kubenswrapper[4901]: I0309 02:43:09.971874 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:09Z","lastTransitionTime":"2026-03-09T02:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.074552 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.074635 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.074660 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.074698 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.074722 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:10Z","lastTransitionTime":"2026-03-09T02:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.106166 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.106352 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:10 crc kubenswrapper[4901]: E0309 02:43:10.106608 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.106760 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:10 crc kubenswrapper[4901]: E0309 02:43:10.106799 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:10 crc kubenswrapper[4901]: E0309 02:43:10.106974 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.177888 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.177954 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.177975 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.177999 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.178022 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:10Z","lastTransitionTime":"2026-03-09T02:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.281730 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.281795 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.281816 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.281842 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.281860 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:10Z","lastTransitionTime":"2026-03-09T02:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.385077 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.385135 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.385155 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.385184 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.385201 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:10Z","lastTransitionTime":"2026-03-09T02:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.487870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.487978 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.488000 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.488025 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.488043 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:10Z","lastTransitionTime":"2026-03-09T02:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.590645 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.590764 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.590784 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.590809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.590826 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:10Z","lastTransitionTime":"2026-03-09T02:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.693870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.693922 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.693939 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.693961 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.693977 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:10Z","lastTransitionTime":"2026-03-09T02:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.796442 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.796502 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.796519 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.796544 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.796565 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:10Z","lastTransitionTime":"2026-03-09T02:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.898924 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.898963 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.898973 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.898989 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:10 crc kubenswrapper[4901]: I0309 02:43:10.899000 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:10Z","lastTransitionTime":"2026-03-09T02:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.001797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.001859 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.001876 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.001900 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.001917 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:11Z","lastTransitionTime":"2026-03-09T02:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.104757 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.104836 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.104860 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.104892 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.104918 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:11Z","lastTransitionTime":"2026-03-09T02:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.207608 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.207669 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.207687 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.207713 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.207730 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:11Z","lastTransitionTime":"2026-03-09T02:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.310692 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.310783 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.310803 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.310866 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.310886 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:11Z","lastTransitionTime":"2026-03-09T02:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.414452 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.414556 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.414577 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.414602 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.414620 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:11Z","lastTransitionTime":"2026-03-09T02:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.517168 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.517249 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.517269 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.517294 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.517311 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:11Z","lastTransitionTime":"2026-03-09T02:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.620304 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.620355 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.620367 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.620382 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.620391 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:11Z","lastTransitionTime":"2026-03-09T02:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.733077 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.733150 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.733171 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.733199 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.733253 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:11Z","lastTransitionTime":"2026-03-09T02:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.836253 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.836352 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.836373 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.836400 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.836418 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:11Z","lastTransitionTime":"2026-03-09T02:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.940372 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.940437 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.940454 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.940480 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:11 crc kubenswrapper[4901]: I0309 02:43:11.940505 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:11Z","lastTransitionTime":"2026-03-09T02:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.043830 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.043892 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.043913 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.043938 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.043955 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:12Z","lastTransitionTime":"2026-03-09T02:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.105197 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.105290 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.105347 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:12 crc kubenswrapper[4901]: E0309 02:43:12.105451 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:12 crc kubenswrapper[4901]: E0309 02:43:12.105540 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:12 crc kubenswrapper[4901]: E0309 02:43:12.105633 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.146940 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.146987 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.146999 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.147039 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.147051 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:12Z","lastTransitionTime":"2026-03-09T02:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.254804 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.255361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.255546 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.255714 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.255863 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:12Z","lastTransitionTime":"2026-03-09T02:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.362675 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.362772 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.362789 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.362813 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.362831 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:12Z","lastTransitionTime":"2026-03-09T02:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.465978 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.466034 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.466050 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.466068 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.466082 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:12Z","lastTransitionTime":"2026-03-09T02:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.569437 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.569499 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.569517 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.569543 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.569560 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:12Z","lastTransitionTime":"2026-03-09T02:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.672807 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.672860 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.672891 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.672916 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.672943 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:12Z","lastTransitionTime":"2026-03-09T02:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.776845 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.777114 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.777140 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.777175 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.777197 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:12Z","lastTransitionTime":"2026-03-09T02:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.880333 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.880407 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.880425 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.880450 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.880468 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:12Z","lastTransitionTime":"2026-03-09T02:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.983747 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.984612 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.984761 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.984933 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:12 crc kubenswrapper[4901]: I0309 02:43:12.985073 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:12Z","lastTransitionTime":"2026-03-09T02:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.089538 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.089575 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.089590 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.089612 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.089629 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:13Z","lastTransitionTime":"2026-03-09T02:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.193673 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.193715 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.193725 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.193739 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.193751 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:13Z","lastTransitionTime":"2026-03-09T02:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.297122 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.297205 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.297251 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.297283 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.297302 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:13Z","lastTransitionTime":"2026-03-09T02:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.400737 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.400786 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.400795 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.400810 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.400819 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:13Z","lastTransitionTime":"2026-03-09T02:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.421464 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t"] Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.422299 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.424108 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.425110 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.441671 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":
{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.462675 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.476037 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.497940 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c
247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.503144 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.503354 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.503510 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.503660 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.503785 4901 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:13Z","lastTransitionTime":"2026-03-09T02:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.511564 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.522470 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d14fa511-1706-4ecd-9190-c39af889657c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.522506 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5wgq\" (UniqueName: \"kubernetes.io/projected/d14fa511-1706-4ecd-9190-c39af889657c-kube-api-access-j5wgq\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.522525 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d14fa511-1706-4ecd-9190-c39af889657c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.522547 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d14fa511-1706-4ecd-9190-c39af889657c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: 
\"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.529790 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.546141 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"en
v-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.566605 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.585522 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.607732 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.607770 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.607782 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.607800 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.607812 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:13Z","lastTransitionTime":"2026-03-09T02:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.618806 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.623299 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d14fa511-1706-4ecd-9190-c39af889657c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.623430 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d14fa511-1706-4ecd-9190-c39af889657c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.623467 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5wgq\" (UniqueName: \"kubernetes.io/projected/d14fa511-1706-4ecd-9190-c39af889657c-kube-api-access-j5wgq\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.623514 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d14fa511-1706-4ecd-9190-c39af889657c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.624321 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d14fa511-1706-4ecd-9190-c39af889657c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.625309 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d14fa511-1706-4ecd-9190-c39af889657c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.631463 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d14fa511-1706-4ecd-9190-c39af889657c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.639362 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.657683 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5wgq\" (UniqueName: \"kubernetes.io/projected/d14fa511-1706-4ecd-9190-c39af889657c-kube-api-access-j5wgq\") pod \"ovnkube-control-plane-749d76644c-rz59t\" (UID: \"d14fa511-1706-4ecd-9190-c39af889657c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.658693 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.691047 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.709188 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.714107 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.714162 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.714180 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.714205 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 
02:43:13.714251 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:13Z","lastTransitionTime":"2026-03-09T02:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.734480 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.742739 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.757585 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: W0309 02:43:13.768187 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd14fa511_1706_4ecd_9190_c39af889657c.slice/crio-d2128687ed2af37a56f185946ae01f4126db675ce2a04bc198a6fa2d0e0fe7a4 WatchSource:0}: Error finding container d2128687ed2af37a56f185946ae01f4126db675ce2a04bc198a6fa2d0e0fe7a4: Status 404 returned error can't find the container with id d2128687ed2af37a56f185946ae01f4126db675ce2a04bc198a6fa2d0e0fe7a4 Mar 09 
02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.784826 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" event={"ID":"d14fa511-1706-4ecd-9190-c39af889657c","Type":"ContainerStarted","Data":"d2128687ed2af37a56f185946ae01f4126db675ce2a04bc198a6fa2d0e0fe7a4"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.786355 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/0.log" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.788204 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb" exitCode=1 Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.788252 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.788954 4901 scope.go:117] "RemoveContainer" containerID="e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.803757 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.817464 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.817501 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.817509 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.817525 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.817535 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:13Z","lastTransitionTime":"2026-03-09T02:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.824421 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.839442 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.854788 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.872735 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"oval\\\\nI0309 02:43:13.253509 6746 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 02:43:13.253619 6746 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:13.253649 6746 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:13.253698 6746 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 02:43:13.253754 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 02:43:13.253789 6746 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:13.253846 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 02:43:13.253850 6746 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:13.253882 6746 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:13.253906 6746 factory.go:656] Stopping watch factory\\\\nI0309 02:43:13.253925 6746 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:13.253822 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:13.253973 6746 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:13.253985 6746 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:13.253996 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:13.254012 6746 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec36
3ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.890919 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340
6de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.908529 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.919456 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.919489 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.919498 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:13 crc 
kubenswrapper[4901]: I0309 02:43:13.919511 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.919520 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:13Z","lastTransitionTime":"2026-03-09T02:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.922205 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.937831 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ 
timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.953203 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.966586 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:13 crc kubenswrapper[4901]: I0309 02:43:13.982594 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c
247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.001144 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:13Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.018636 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.022464 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.022506 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.022517 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.022535 4901 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.022548 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.032844 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\
\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.048419 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea
83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.049638 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.049676 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.049685 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.049700 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.049710 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.066745 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.070294 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.070330 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.070342 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.070359 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.070370 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.090423 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.094458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.094495 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.094505 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.094523 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.094533 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.105594 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.105686 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.105880 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.105954 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.106206 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.106296 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.106457 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.109239 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.109267 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.109278 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.109293 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.109304 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.123315 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.133121 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.133185 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.133199 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.133237 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.133250 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.149437 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.149622 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.151526 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.151563 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.151577 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.151597 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.151611 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.179907 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-lg26b"] Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.180369 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.180418 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.193915 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.212539 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.230318 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgs6t\" (UniqueName: \"kubernetes.io/projected/9e883667-62d8-4920-a810-558a77f260ca-kube-api-access-hgs6t\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.230397 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.231178 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc 
kubenswrapper[4901]: I0309 02:43:14.255927 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.256298 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.256332 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.256420 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.256520 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.256797 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.277494 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.295637 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.321830 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c
247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.331677 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgs6t\" (UniqueName: \"kubernetes.io/projected/9e883667-62d8-4920-a810-558a77f260ca-kube-api-access-hgs6t\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.331785 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " 
pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.331957 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.332039 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs podName:9e883667-62d8-4920-a810-558a77f260ca nodeName:}" failed. No retries permitted until 2026-03-09 02:43:14.832016774 +0000 UTC m=+119.421680546 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs") pod "network-metrics-daemon-lg26b" (UID: "9e883667-62d8-4920-a810-558a77f260ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.338918 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.358128 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.359697 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.359755 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.359777 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.359806 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.359827 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.368132 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgs6t\" (UniqueName: \"kubernetes.io/projected/9e883667-62d8-4920-a810-558a77f260ca-kube-api-access-hgs6t\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.381332 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/en
v\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.402604 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.424088 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.462360 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.462421 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.462440 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.462482 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.462499 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.484643 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.514917 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.537408 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.555766 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"oval\\\\nI0309 02:43:13.253509 6746 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 02:43:13.253619 6746 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:13.253649 6746 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:13.253698 6746 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 02:43:13.253754 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 02:43:13.253789 6746 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:13.253846 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 02:43:13.253850 6746 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:13.253882 6746 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:13.253906 6746 factory.go:656] Stopping watch factory\\\\nI0309 02:43:13.253925 6746 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:13.253822 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:13.253973 6746 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:13.253985 6746 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:13.253996 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:13.254012 6746 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec36
3ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.564846 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.564878 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.564886 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.564902 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.564911 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.566260 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"na
me\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.667426 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.667472 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.667483 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.667502 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.667515 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.770290 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.770365 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.770385 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.770411 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.770431 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.794143 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" event={"ID":"d14fa511-1706-4ecd-9190-c39af889657c","Type":"ContainerStarted","Data":"994153537c9e7d0ac89a6651b22147c42d9263d9652cc7748afa72e1e46a344a"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.794251 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" event={"ID":"d14fa511-1706-4ecd-9190-c39af889657c","Type":"ContainerStarted","Data":"799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.796542 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/0.log" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.799667 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.800276 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.811578 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.831529 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"oval\\\\nI0309 02:43:13.253509 6746 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 02:43:13.253619 6746 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:13.253649 6746 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:13.253698 6746 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 02:43:13.253754 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 02:43:13.253789 6746 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:13.253846 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 02:43:13.253850 6746 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:13.253882 6746 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:13.253906 6746 factory.go:656] Stopping watch factory\\\\nI0309 02:43:13.253925 6746 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:13.253822 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:13.253973 6746 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:13.253985 6746 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:13.253996 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:13.254012 6746 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec36
3ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.838125 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.838416 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:14 crc kubenswrapper[4901]: E0309 02:43:14.838486 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs podName:9e883667-62d8-4920-a810-558a77f260ca nodeName:}" failed. No retries permitted until 2026-03-09 02:43:15.838464626 +0000 UTC m=+120.428128358 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs") pod "network-metrics-daemon-lg26b" (UID: "9e883667-62d8-4920-a810-558a77f260ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.845609 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.871666 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.873642 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.873705 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.873723 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.873750 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.873769 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.888570 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.908505 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.925461 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.936517 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.946602 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T
02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.964364 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a
4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.975333 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d9263d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.976570 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.976610 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.976619 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.976633 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.976643 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:14Z","lastTransitionTime":"2026-03-09T02:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:14 crc kubenswrapper[4901]: I0309 02:43:14.995299 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:14Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.014284 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.029343 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.044752 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.069308 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.078871 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.078920 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.078938 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.078961 4901 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.078975 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:15Z","lastTransitionTime":"2026-03-09T02:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.091117 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\
\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.111379 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.128279 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.145675 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.159394 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.170908 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.181128 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.181194 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.181218 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.181278 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.181300 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:15Z","lastTransitionTime":"2026-03-09T02:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.196154 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.215608 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.235252 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.262650 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"oval\\\\nI0309 02:43:13.253509 6746 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 02:43:13.253619 6746 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:13.253649 6746 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:13.253698 6746 handler.go:190] Sending *v1.EgressFirewall 
event handler 9 for removal\\\\nI0309 02:43:13.253754 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 02:43:13.253789 6746 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:13.253846 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 02:43:13.253850 6746 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:13.253882 6746 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:13.253906 6746 factory.go:656] Stopping watch factory\\\\nI0309 02:43:13.253925 6746 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:13.253822 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:13.253973 6746 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:13.253985 6746 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:13.253996 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:13.254012 6746 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\
\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.283297 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340
6de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.285439 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.285512 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.285530 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.285555 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.285573 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:15Z","lastTransitionTime":"2026-03-09T02:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.307834 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.324904 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.343536 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d9263d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.363876 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.382341 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.388977 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.389015 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.389030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.389053 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.389069 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:15Z","lastTransitionTime":"2026-03-09T02:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.400354 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.423804 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.491634 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.491670 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.491680 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.491696 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.491707 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:15Z","lastTransitionTime":"2026-03-09T02:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.593952 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.594010 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.594027 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.594050 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.594069 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:15Z","lastTransitionTime":"2026-03-09T02:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.697662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.697720 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.697738 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.697764 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.697782 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:15Z","lastTransitionTime":"2026-03-09T02:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.799769 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.799828 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.799845 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.799869 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.799888 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:15Z","lastTransitionTime":"2026-03-09T02:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.804989 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/1.log" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.805819 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/0.log" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.809928 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4" exitCode=1 Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.810055 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4"} Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.810112 4901 scope.go:117] "RemoveContainer" containerID="e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.811214 4901 scope.go:117] "RemoveContainer" containerID="c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4" Mar 09 02:43:15 crc kubenswrapper[4901]: E0309 02:43:15.811569 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bmfgc_openshift-ovn-kubernetes(40c17e04-3fc2-48a2-95dc-fe0428b91e66)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.829096 4901 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: E0309 02:43:15.850333 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.850886 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:15 crc kubenswrapper[4901]: E0309 02:43:15.850978 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs podName:9e883667-62d8-4920-a810-558a77f260ca nodeName:}" failed. No retries permitted until 2026-03-09 02:43:17.850949329 +0000 UTC m=+122.440613101 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs") pod "network-metrics-daemon-lg26b" (UID: "9e883667-62d8-4920-a810-558a77f260ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.864598 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f
7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.885007 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.904128 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.904181 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:15 crc 
kubenswrapper[4901]: I0309 02:43:15.904200 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.904256 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.904274 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:15Z","lastTransitionTime":"2026-03-09T02:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.905742 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.943782 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"oval\\\\nI0309 02:43:13.253509 6746 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 02:43:13.253619 6746 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:13.253649 6746 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:13.253698 6746 handler.go:190] Sending *v1.EgressFirewall 
event handler 9 for removal\\\\nI0309 02:43:13.253754 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 02:43:13.253789 6746 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:13.253846 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 02:43:13.253850 6746 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:13.253882 6746 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:13.253906 6746 factory.go:656] Stopping watch factory\\\\nI0309 02:43:13.253925 6746 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:13.253822 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:13.253973 6746 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:13.253985 6746 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:13.253996 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:13.254012 6746 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending 
*v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\
"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.966967 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c6
3b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.986319 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:15 crc kubenswrapper[4901]: I0309 02:43:15.999430 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:15Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.007152 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.007212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.007265 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.007293 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.007312 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:16Z","lastTransitionTime":"2026-03-09T02:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.018121 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d9263d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.041408 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.062115 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.078939 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.102688 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c
247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.107395 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.107425 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:16 crc kubenswrapper[4901]: E0309 02:43:16.107497 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.107562 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:16 crc kubenswrapper[4901]: E0309 02:43:16.107668 4901 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 09 02:43:16 crc kubenswrapper[4901]: E0309 02:43:16.107763 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.107935 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:16 crc kubenswrapper[4901]: E0309 02:43:16.108114 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:16 crc kubenswrapper[4901]: E0309 02:43:16.108265 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.127400 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.150162 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":
\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.170331 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.186879 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.206162 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.234445 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.249744 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.265381 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c
247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.276278 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d926
3d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.290121 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.301846 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.315338 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.325480 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.343748 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.356488 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.372246 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.398588 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e27513170313b32833f104dcccd7fe37890a5eff0dc33d3993efba6ae93dbb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"message\\\":\\\"oval\\\\nI0309 02:43:13.253509 6746 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 02:43:13.253619 6746 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:13.253649 6746 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:13.253698 6746 handler.go:190] Sending *v1.EgressFirewall 
event handler 9 for removal\\\\nI0309 02:43:13.253754 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 02:43:13.253789 6746 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:13.253846 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 02:43:13.253850 6746 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:13.253882 6746 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:13.253906 6746 factory.go:656] Stopping watch factory\\\\nI0309 02:43:13.253925 6746 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:13.253822 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:13.253973 6746 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:13.253985 6746 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:13.253996 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:13.254012 6746 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending 
*v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\
"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.412012 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.429684 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.443922 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: E0309 02:43:16.449255 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.459907 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc 
kubenswrapper[4901]: I0309 02:43:16.816188 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/1.log" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.823495 4901 scope.go:117] "RemoveContainer" containerID="c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4" Mar 09 02:43:16 crc kubenswrapper[4901]: E0309 02:43:16.824356 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bmfgc_openshift-ovn-kubernetes(40c17e04-3fc2-48a2-95dc-fe0428b91e66)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.854535 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.874039 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.891355 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.920892 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmfgc_openshift-ovn-kubernetes(40c17e04-3fc2-48a2-95dc-fe0428b91e66)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce
265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.935793 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.952692 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.967269 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:16 crc kubenswrapper[4901]: I0309 02:43:16.989817 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:16Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:17 crc kubenswrapper[4901]: I0309 02:43:17.006760 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:17Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:17 crc kubenswrapper[4901]: I0309 02:43:17.019819 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:17Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:17 crc kubenswrapper[4901]: I0309 02:43:17.037883 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae45
8654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:17Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:17 crc kubenswrapper[4901]: I0309 02:43:17.051729 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d926
3d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:17Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:17 crc kubenswrapper[4901]: I0309 02:43:17.065292 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:17Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:17 crc kubenswrapper[4901]: I0309 02:43:17.077202 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:17Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:17 crc kubenswrapper[4901]: I0309 02:43:17.089803 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:17Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:17 crc kubenswrapper[4901]: I0309 02:43:17.103149 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:17Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:17 crc kubenswrapper[4901]: I0309 02:43:17.119267 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:17Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:17 crc kubenswrapper[4901]: I0309 02:43:17.873345 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:17 crc kubenswrapper[4901]: E0309 02:43:17.873571 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:17 crc kubenswrapper[4901]: E0309 02:43:17.874058 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs podName:9e883667-62d8-4920-a810-558a77f260ca nodeName:}" failed. No retries permitted until 2026-03-09 02:43:21.874029311 +0000 UTC m=+126.463693084 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs") pod "network-metrics-daemon-lg26b" (UID: "9e883667-62d8-4920-a810-558a77f260ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:18 crc kubenswrapper[4901]: I0309 02:43:18.106277 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:18 crc kubenswrapper[4901]: I0309 02:43:18.106332 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:18 crc kubenswrapper[4901]: I0309 02:43:18.106333 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:18 crc kubenswrapper[4901]: E0309 02:43:18.106484 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:18 crc kubenswrapper[4901]: E0309 02:43:18.106575 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:18 crc kubenswrapper[4901]: E0309 02:43:18.106650 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:18 crc kubenswrapper[4901]: I0309 02:43:18.107383 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:18 crc kubenswrapper[4901]: E0309 02:43:18.107608 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.105752 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.105793 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.105758 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:20 crc kubenswrapper[4901]: E0309 02:43:20.105931 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.106006 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:20 crc kubenswrapper[4901]: E0309 02:43:20.106160 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:20 crc kubenswrapper[4901]: E0309 02:43:20.106738 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:20 crc kubenswrapper[4901]: E0309 02:43:20.107543 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.107942 4901 scope.go:117] "RemoveContainer" containerID="d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.840656 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.843375 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406"} Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.844600 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.865106 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:20Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.881713 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:20Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.902638 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:20Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.921889 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:20Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.953719 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:20Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.971768 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:20Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:20 crc kubenswrapper[4901]: I0309 02:43:20.991481 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:20Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc kubenswrapper[4901]: I0309 02:43:21.025566 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmfgc_openshift-ovn-kubernetes(40c17e04-3fc2-48a2-95dc-fe0428b91e66)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce
265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:21Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc kubenswrapper[4901]: I0309 02:43:21.042874 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:21Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc kubenswrapper[4901]: I0309 02:43:21.063744 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:21Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc kubenswrapper[4901]: I0309 02:43:21.080407 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:21Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc kubenswrapper[4901]: I0309 02:43:21.097181 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:21Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc 
kubenswrapper[4901]: I0309 02:43:21.115948 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:21Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc kubenswrapper[4901]: I0309 02:43:21.136906 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:21Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc kubenswrapper[4901]: I0309 02:43:21.153107 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:21Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc kubenswrapper[4901]: I0309 02:43:21.173141 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c
247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:21Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc kubenswrapper[4901]: I0309 02:43:21.189848 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d926
3d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:21Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:21 crc kubenswrapper[4901]: E0309 02:43:21.451374 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 02:43:21 crc kubenswrapper[4901]: I0309 02:43:21.921480 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:21 crc kubenswrapper[4901]: E0309 02:43:21.921755 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:21 crc kubenswrapper[4901]: E0309 02:43:21.921891 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs podName:9e883667-62d8-4920-a810-558a77f260ca nodeName:}" failed. No retries permitted until 2026-03-09 02:43:29.921865346 +0000 UTC m=+134.511529118 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs") pod "network-metrics-daemon-lg26b" (UID: "9e883667-62d8-4920-a810-558a77f260ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:22 crc kubenswrapper[4901]: I0309 02:43:22.105877 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:22 crc kubenswrapper[4901]: I0309 02:43:22.105940 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:22 crc kubenswrapper[4901]: I0309 02:43:22.106314 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:22 crc kubenswrapper[4901]: I0309 02:43:22.106378 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:22 crc kubenswrapper[4901]: E0309 02:43:22.106431 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:22 crc kubenswrapper[4901]: E0309 02:43:22.106520 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:22 crc kubenswrapper[4901]: E0309 02:43:22.106528 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:22 crc kubenswrapper[4901]: E0309 02:43:22.106638 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.105917 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:24 crc kubenswrapper[4901]: E0309 02:43:24.106176 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.106330 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.106364 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.106410 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:24 crc kubenswrapper[4901]: E0309 02:43:24.106531 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:24 crc kubenswrapper[4901]: E0309 02:43:24.106675 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:24 crc kubenswrapper[4901]: E0309 02:43:24.106800 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.323551 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.323625 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.323643 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.323668 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.323686 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:24Z","lastTransitionTime":"2026-03-09T02:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:24 crc kubenswrapper[4901]: E0309 02:43:24.347089 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:24Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.357052 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.357177 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.357203 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.357272 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.357303 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:24Z","lastTransitionTime":"2026-03-09T02:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:24 crc kubenswrapper[4901]: E0309 02:43:24.376838 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:24Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.383077 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.383142 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.383161 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.383189 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.383209 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:24Z","lastTransitionTime":"2026-03-09T02:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:24 crc kubenswrapper[4901]: E0309 02:43:24.405135 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:24Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.410247 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.410381 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.410470 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.410575 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.410675 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:24Z","lastTransitionTime":"2026-03-09T02:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:24 crc kubenswrapper[4901]: E0309 02:43:24.431424 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:24Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.436809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.436938 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.437033 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.437141 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:24 crc kubenswrapper[4901]: I0309 02:43:24.437260 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:24Z","lastTransitionTime":"2026-03-09T02:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:24 crc kubenswrapper[4901]: E0309 02:43:24.456832 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:24Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:24 crc kubenswrapper[4901]: E0309 02:43:24.457074 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.106087 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.106146 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.106273 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.106109 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:26 crc kubenswrapper[4901]: E0309 02:43:26.106382 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:26 crc kubenswrapper[4901]: E0309 02:43:26.106568 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:26 crc kubenswrapper[4901]: E0309 02:43:26.106783 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:26 crc kubenswrapper[4901]: E0309 02:43:26.106982 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.130507 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.154941 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":
\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.178729 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.196015 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.213949 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.251124 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02
:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.271839 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.291677 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.325304 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmfgc_openshift-ovn-kubernetes(40c17e04-3fc2-48a2-95dc-fe0428b91e66)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce
265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.348350 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.373995 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.398503 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc 
kubenswrapper[4901]: I0309 02:43:26.421120 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d9263d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.446727 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: E0309 02:43:26.452866 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.468923 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.489756 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:26 crc kubenswrapper[4901]: I0309 02:43:26.512805 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a
4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:26Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:28 crc kubenswrapper[4901]: I0309 02:43:28.105395 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:28 crc kubenswrapper[4901]: I0309 02:43:28.105566 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:28 crc kubenswrapper[4901]: E0309 02:43:28.105709 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:28 crc kubenswrapper[4901]: I0309 02:43:28.105995 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:28 crc kubenswrapper[4901]: I0309 02:43:28.106050 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:28 crc kubenswrapper[4901]: E0309 02:43:28.106158 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:28 crc kubenswrapper[4901]: E0309 02:43:28.106440 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:28 crc kubenswrapper[4901]: E0309 02:43:28.106943 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:28 crc kubenswrapper[4901]: I0309 02:43:28.119818 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.022493 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:30 crc kubenswrapper[4901]: E0309 02:43:30.022658 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:30 crc kubenswrapper[4901]: E0309 02:43:30.022806 4901 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs podName:9e883667-62d8-4920-a810-558a77f260ca nodeName:}" failed. No retries permitted until 2026-03-09 02:43:46.022771792 +0000 UTC m=+150.612435554 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs") pod "network-metrics-daemon-lg26b" (UID: "9e883667-62d8-4920-a810-558a77f260ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.106291 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.106444 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.106384 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:30 crc kubenswrapper[4901]: E0309 02:43:30.106617 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.106829 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:30 crc kubenswrapper[4901]: E0309 02:43:30.106944 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:30 crc kubenswrapper[4901]: E0309 02:43:30.107080 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:30 crc kubenswrapper[4901]: E0309 02:43:30.107207 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.671493 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.695670 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\\\",\\\"image\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.720124 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.738747 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.771375 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T
02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.790938 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a
4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.806338 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d9263d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.821660 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.836091 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.850795 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.871731 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.888670 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.908193 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.925577 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.945738 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmfgc_openshift-ovn-kubernetes(40c17e04-3fc2-48a2-95dc-fe0428b91e66)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce
265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.959414 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:30 crc kubenswrapper[4901]: I0309 02:43:30.978087 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d957c835-1f04-423a-9641-444392da0360\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3836e17de731d173fb726c092d6f4a3ed70b2eab0297fb4e146004c22a3fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e716f348e3907e1861b6fd17ba499078ec41d0ee31bc8855d9744e4b0f640\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf2092a65f531dedb9529485252943746497cea4fec2b6c52ed5220eed868129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:30Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:31 crc kubenswrapper[4901]: I0309 02:43:31.009208 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:31Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:31 crc kubenswrapper[4901]: I0309 02:43:31.033078 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:31Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:31 crc kubenswrapper[4901]: E0309 02:43:31.454512 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.105818 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.105825 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.105861 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.105897 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:32 crc kubenswrapper[4901]: E0309 02:43:32.106510 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:32 crc kubenswrapper[4901]: E0309 02:43:32.106745 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:32 crc kubenswrapper[4901]: E0309 02:43:32.106817 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:32 crc kubenswrapper[4901]: E0309 02:43:32.106659 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.107020 4901 scope.go:117] "RemoveContainer" containerID="c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.889421 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/1.log" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.894213 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"} Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.894934 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.931293 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a
4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:32Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.957596 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d9263d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-09T02:43:32Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.974109 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:32Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:32 crc kubenswrapper[4901]: I0309 02:43:32.990274 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:32Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.003490 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.017943 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.034471 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.050162 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.063368 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.082699 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.096785 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.109383 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d957c835-1f04-423a-9641-444392da0360\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3836e17de731d173fb726c092d6f4a3ed70b2eab0297fb4e146004c22a3fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e716f348e3907e1861b6fd17ba499078ec41d0ee31bc8855d9744e4b0f640\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf2092a65f531dedb9529485252943746497cea4fec2b6c52ed5220eed868129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.135078 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.150647 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.168405 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.188142 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a
4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.200134 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:33 crc kubenswrapper[4901]: I0309 02:43:33.211862 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:33Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:34 crc 
kubenswrapper[4901]: I0309 02:43:34.106031 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.106084 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.106200 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:34 crc kubenswrapper[4901]: E0309 02:43:34.106397 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.106435 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:34 crc kubenswrapper[4901]: E0309 02:43:34.106628 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:34 crc kubenswrapper[4901]: E0309 02:43:34.106861 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:34 crc kubenswrapper[4901]: E0309 02:43:34.106942 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.623642 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.623719 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.623741 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.623769 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.623791 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:34Z","lastTransitionTime":"2026-03-09T02:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:34 crc kubenswrapper[4901]: E0309 02:43:34.645269 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:34Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.650820 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.650870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.650887 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.650911 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.650929 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:34Z","lastTransitionTime":"2026-03-09T02:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:34 crc kubenswrapper[4901]: E0309 02:43:34.670204 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:34Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.675821 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.675875 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.675909 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.675931 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.675948 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:34Z","lastTransitionTime":"2026-03-09T02:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:34 crc kubenswrapper[4901]: E0309 02:43:34.696702 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:34Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.702458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.702526 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.702549 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.702577 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.702602 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:34Z","lastTransitionTime":"2026-03-09T02:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:34 crc kubenswrapper[4901]: E0309 02:43:34.717008 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:34Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.721853 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.721908 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.721932 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.721961 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:34 crc kubenswrapper[4901]: I0309 02:43:34.721984 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:34Z","lastTransitionTime":"2026-03-09T02:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:34 crc kubenswrapper[4901]: E0309 02:43:34.742955 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:34Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:34 crc kubenswrapper[4901]: E0309 02:43:34.743172 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.105700 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.105739 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:36 crc kubenswrapper[4901]: E0309 02:43:36.105849 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.105929 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:36 crc kubenswrapper[4901]: E0309 02:43:36.106036 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.106144 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:36 crc kubenswrapper[4901]: E0309 02:43:36.106429 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:36 crc kubenswrapper[4901]: E0309 02:43:36.107329 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.129213 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.146700 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.162997 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.184680 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.198595 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.216999 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d957c835-1f04-423a-9641-444392da0360\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3836e17de731d173fb726c092d6f4a3ed70b2eab0297fb4e146004c22a3fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e716f348e3907e1861b6fd17ba499078ec41d0ee31bc8855d9744e4b0f640\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf2092a65f531dedb9529485252943746497cea4fec2b6c52ed5220eed868129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.235174 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.250912 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.270808 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.285910 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.296994 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.313127 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c
247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.325698 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d926
3d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.343419 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.358149 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.375027 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.386521 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: I0309 02:43:36.405278 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:36Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:36 crc kubenswrapper[4901]: E0309 02:43:36.455740 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 02:43:38 crc kubenswrapper[4901]: I0309 02:43:38.019867 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:43:38 crc kubenswrapper[4901]: I0309 02:43:38.020009 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020066 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.020023389 +0000 UTC m=+206.609687141 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:43:38 crc kubenswrapper[4901]: I0309 02:43:38.020131 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:38 crc kubenswrapper[4901]: I0309 02:43:38.020187 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020199 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:43:38 crc kubenswrapper[4901]: I0309 02:43:38.020332 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:38 crc 
kubenswrapper[4901]: E0309 02:43:38.020350 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.020323836 +0000 UTC m=+206.609987598 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020412 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020432 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020443 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020460 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020498 4901 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.02048247 +0000 UTC m=+206.610146302 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020464 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020519 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.020507241 +0000 UTC m=+206.610171103 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020583 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020609 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.020711 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.020687545 +0000 UTC m=+206.610351357 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 02:43:38 crc kubenswrapper[4901]: I0309 02:43:38.106468 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:38 crc kubenswrapper[4901]: I0309 02:43:38.106537 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.106676 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:38 crc kubenswrapper[4901]: I0309 02:43:38.106969 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.107105 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:38 crc kubenswrapper[4901]: I0309 02:43:38.107178 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.107368 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:38 crc kubenswrapper[4901]: E0309 02:43:38.107579 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:40 crc kubenswrapper[4901]: I0309 02:43:40.105711 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:40 crc kubenswrapper[4901]: E0309 02:43:40.106184 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:40 crc kubenswrapper[4901]: I0309 02:43:40.106272 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:40 crc kubenswrapper[4901]: I0309 02:43:40.106335 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:40 crc kubenswrapper[4901]: I0309 02:43:40.106288 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:40 crc kubenswrapper[4901]: E0309 02:43:40.106593 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:40 crc kubenswrapper[4901]: E0309 02:43:40.106921 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:40 crc kubenswrapper[4901]: E0309 02:43:40.107007 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:41 crc kubenswrapper[4901]: E0309 02:43:41.456945 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 02:43:42 crc kubenswrapper[4901]: I0309 02:43:42.105966 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:42 crc kubenswrapper[4901]: I0309 02:43:42.106040 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:42 crc kubenswrapper[4901]: I0309 02:43:42.106134 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:42 crc kubenswrapper[4901]: E0309 02:43:42.106318 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:42 crc kubenswrapper[4901]: I0309 02:43:42.106361 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:42 crc kubenswrapper[4901]: E0309 02:43:42.106528 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:42 crc kubenswrapper[4901]: E0309 02:43:42.106641 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:42 crc kubenswrapper[4901]: E0309 02:43:42.107204 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:44 crc kubenswrapper[4901]: I0309 02:43:44.105341 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:44 crc kubenswrapper[4901]: I0309 02:43:44.105456 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:44 crc kubenswrapper[4901]: E0309 02:43:44.105509 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:44 crc kubenswrapper[4901]: E0309 02:43:44.105734 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:44 crc kubenswrapper[4901]: I0309 02:43:44.105802 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:44 crc kubenswrapper[4901]: E0309 02:43:44.105980 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:44 crc kubenswrapper[4901]: I0309 02:43:44.106517 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:44 crc kubenswrapper[4901]: E0309 02:43:44.106733 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:44 crc kubenswrapper[4901]: I0309 02:43:44.974031 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:44 crc kubenswrapper[4901]: I0309 02:43:44.974112 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:44 crc kubenswrapper[4901]: I0309 02:43:44.974135 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:44 crc kubenswrapper[4901]: I0309 02:43:44.974164 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:44 crc kubenswrapper[4901]: I0309 02:43:44.974186 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:44Z","lastTransitionTime":"2026-03-09T02:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:44 crc kubenswrapper[4901]: E0309 02:43:44.995356 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:44Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.005936 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.005992 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.006010 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.006033 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.006052 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:45Z","lastTransitionTime":"2026-03-09T02:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:45 crc kubenswrapper[4901]: E0309 02:43:45.027072 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:45Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.032514 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.032587 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.032611 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.032642 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.032664 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:45Z","lastTransitionTime":"2026-03-09T02:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:45 crc kubenswrapper[4901]: E0309 02:43:45.054085 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:45Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.059653 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.059728 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.059746 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.059773 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.059791 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:45Z","lastTransitionTime":"2026-03-09T02:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:45 crc kubenswrapper[4901]: E0309 02:43:45.080612 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:45Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.085797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.085880 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.085903 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.085936 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:45 crc kubenswrapper[4901]: I0309 02:43:45.085958 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:45Z","lastTransitionTime":"2026-03-09T02:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 02:43:45 crc kubenswrapper[4901]: E0309 02:43:45.107813 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4fb5477d-c9aa-418f-9a0d-560ac0227b13\\\",\\\"systemUUID\\\":\\\"92bcf75b-bfed-4296-bbf2-d35c6ac3a586\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:45Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:45 crc kubenswrapper[4901]: E0309 02:43:45.108049 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.025896 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:46 crc kubenswrapper[4901]: E0309 02:43:46.026176 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:46 crc kubenswrapper[4901]: E0309 02:43:46.026312 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs podName:9e883667-62d8-4920-a810-558a77f260ca nodeName:}" failed. No retries permitted until 2026-03-09 02:44:18.026284038 +0000 UTC m=+182.615947860 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs") pod "network-metrics-daemon-lg26b" (UID: "9e883667-62d8-4920-a810-558a77f260ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.105877 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.105892 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.106010 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.106436 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:46 crc kubenswrapper[4901]: E0309 02:43:46.106454 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:46 crc kubenswrapper[4901]: E0309 02:43:46.106814 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:46 crc kubenswrapper[4901]: E0309 02:43:46.106921 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:46 crc kubenswrapper[4901]: E0309 02:43:46.106944 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.126382 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d957c835-1f04-423a-9641-444392da0360\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3836e17de731d173fb726c092d6f4a3ed70b2eab0297fb4e146004c22a3fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e716f348e3907e1861b6fd17ba499078ec41d0ee31bc8855d9744e4b0f640\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf2092a65f531dedb9529485252943746497cea4fec2b6c52ed5220eed868129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.160111 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.181460 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.201843 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.224130 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.238192 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.251983 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.270323 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.283140 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.303415 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.318215 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.329966 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.344433 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c
247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.357101 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d926
3d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.374948 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.392071 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.410665 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: I0309 02:43:46.431955 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31
cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:46Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:46 crc kubenswrapper[4901]: E0309 02:43:46.459032 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 02:43:48 crc kubenswrapper[4901]: I0309 02:43:48.105651 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:48 crc kubenswrapper[4901]: I0309 02:43:48.105749 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:48 crc kubenswrapper[4901]: E0309 02:43:48.105907 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:48 crc kubenswrapper[4901]: I0309 02:43:48.105931 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:48 crc kubenswrapper[4901]: I0309 02:43:48.105660 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:48 crc kubenswrapper[4901]: E0309 02:43:48.106104 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:48 crc kubenswrapper[4901]: E0309 02:43:48.106306 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:48 crc kubenswrapper[4901]: E0309 02:43:48.106411 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:48 crc kubenswrapper[4901]: I0309 02:43:48.955496 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-429fk_a0d0e040-7ca3-4af8-9f02-d96cff6b3edf/kube-multus/0.log" Mar 09 02:43:48 crc kubenswrapper[4901]: I0309 02:43:48.955582 4901 generic.go:334] "Generic (PLEG): container finished" podID="a0d0e040-7ca3-4af8-9f02-d96cff6b3edf" containerID="0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1" exitCode=1 Mar 09 02:43:48 crc kubenswrapper[4901]: I0309 02:43:48.955626 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-429fk" event={"ID":"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf","Type":"ContainerDied","Data":"0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1"} Mar 09 02:43:48 crc kubenswrapper[4901]: I0309 02:43:48.956318 4901 scope.go:117] "RemoveContainer" containerID="0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1" Mar 09 02:43:48 crc kubenswrapper[4901]: I0309 02:43:48.977206 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d957c835-1f04-423a-9641-444392da0360\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3836e17de731d173fb726c092d6f4a3ed70b2eab0297fb4e146004c22a3fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e716f348e3907e1861b6fd17ba499078ec41d0ee31bc8855d9744e4b0f640\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf2092a65f531dedb9529485252943746497cea4fec2b6c52ed5220eed868129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:48Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.003883 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.025877 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.045975 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.075341 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.089595 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.108017 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.130545 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.146970 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.164632 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.179589 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.191883 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.213813 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c
247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.227977 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d926
3d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.244378 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.258813 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.277446 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:48Z\\\",\\\"message\\\":\\\"2026-03-09T02:43:02+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ed7c822-a8d2-4694-b891-16aaa4b3a68b\\\\n2026-03-09T02:43:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ed7c822-a8d2-4694-b891-16aaa4b3a68b to /host/opt/cni/bin/\\\\n2026-03-09T02:43:03Z [verbose] multus-daemon started\\\\n2026-03-09T02:43:03Z [verbose] Readiness Indicator file check\\\\n2026-03-09T02:43:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.290735 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a
36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.962694 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-429fk_a0d0e040-7ca3-4af8-9f02-d96cff6b3edf/kube-multus/0.log" Mar 09 02:43:49 crc kubenswrapper[4901]: I0309 02:43:49.962756 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-429fk" event={"ID":"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf","Type":"ContainerStarted","Data":"3c66d39cf5683043c4f736459fb89385f1480a81932fe2d1f1fd5dc7314f6f54"} Mar 09 02:43:49 crc 
kubenswrapper[4901]: I0309 02:43:49.986828 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:49Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.007923 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.028817 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c66d39cf5683043c4f736459fb89385f1480a81932fe2d1f1fd5dc7314f6f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:48Z\\\",\\\"message\\\":\\\"2026-03-09T02:43:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ed7c822-a8d2-4694-b891-16aaa4b3a68b\\\\n2026-03-09T02:43:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ed7c822-a8d2-4694-b891-16aaa4b3a68b to /host/opt/cni/bin/\\\\n2026-03-09T02:43:03Z [verbose] multus-daemon started\\\\n2026-03-09T02:43:03Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T02:43:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.044044 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a
36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.059387 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.078014 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d957c835-1f04-423a-9641-444392da0360\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3836e17de731d173fb726c092d6f4a3ed70b2eab0297fb4e146004c22a3fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e716f348e3907e1861b6fd17ba499078ec41d0ee31bc8855d9744e4b0f640\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf2092a65f531dedb9529485252943746497cea4fec2b6c52ed5220eed868129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.105755 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.105863 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:50 crc kubenswrapper[4901]: E0309 02:43:50.105892 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:50 crc kubenswrapper[4901]: E0309 02:43:50.106044 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.106158 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:50 crc kubenswrapper[4901]: E0309 02:43:50.106287 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.106297 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:50 crc kubenswrapper[4901]: E0309 02:43:50.106388 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.112574 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.130361 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.144530 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.172973 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.189362 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a
4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.203342 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.215718 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc 
kubenswrapper[4901]: I0309 02:43:50.232326 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d9263d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.249812 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.266812 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.280866 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:50 crc kubenswrapper[4901]: I0309 02:43:50.303335 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c
247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:50Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:51 crc kubenswrapper[4901]: E0309 02:43:51.461022 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 02:43:52 crc kubenswrapper[4901]: I0309 02:43:52.106172 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:52 crc kubenswrapper[4901]: I0309 02:43:52.106265 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:52 crc kubenswrapper[4901]: E0309 02:43:52.106760 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:52 crc kubenswrapper[4901]: I0309 02:43:52.106485 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:52 crc kubenswrapper[4901]: E0309 02:43:52.106934 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:52 crc kubenswrapper[4901]: I0309 02:43:52.106411 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:52 crc kubenswrapper[4901]: E0309 02:43:52.107417 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:52 crc kubenswrapper[4901]: E0309 02:43:52.107654 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:53 crc kubenswrapper[4901]: I0309 02:43:53.980961 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/2.log" Mar 09 02:43:53 crc kubenswrapper[4901]: I0309 02:43:53.982620 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/1.log" Mar 09 02:43:53 crc kubenswrapper[4901]: I0309 02:43:53.986668 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e" exitCode=1 Mar 09 02:43:53 crc kubenswrapper[4901]: I0309 02:43:53.986734 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"} Mar 09 02:43:53 crc kubenswrapper[4901]: I0309 02:43:53.986808 4901 scope.go:117] "RemoveContainer" containerID="c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4" Mar 09 02:43:53 crc kubenswrapper[4901]: I0309 02:43:53.987742 4901 scope.go:117] "RemoveContainer" containerID="42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e" Mar 09 02:43:53 crc 
kubenswrapper[4901]: E0309 02:43:53.987950 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bmfgc_openshift-ovn-kubernetes(40c17e04-3fc2-48a2-95dc-fe0428b91e66)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.011538 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.037842 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40c17e04-3fc2-48a2-95dc-fe0428b91e66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c188c6c7f7cc9a46c9e941cf0096b7d2828dac9036d388388958f1d44f3cfdd4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:15Z\\\",\\\"message\\\":\\\"1336] Added *v1.EgressIP event handler 8\\\\nI0309 02:43:15.073036 6943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 02:43:15.073060 6943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 02:43:15.073089 6943 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 02:43:15.073122 6943 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 02:43:15.073137 6943 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 02:43:15.073159 6943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 02:43:15.073188 6943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 02:43:15.073200 6943 factory.go:656] Stopping watch factory\\\\nI0309 02:43:15.073201 6943 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 02:43:15.073165 6943 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 02:43:15.073258 6943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 02:43:15.073380 6943 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0309 02:43:15.073436 6943 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0309 02:43:15.073461 6943 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:15.073484 6943 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 02:43:15.073539 6943 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:53Z\\\",\\\"message\\\":\\\"e-apiserver-operator-766d6c64bb-xtpw8 is in primary UDN: could not find OVN pod annotation in map[]\\\\nI0309 02:43:53.058650 7279 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7: failed to check if pod openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7 is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 
seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI0309 02:43:53.058672 7279 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7: failed to check if pod openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7 is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nE0309 02:43:53.140491 7279 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI0309 02:43:53.141669 7279 ovnkube.go:599] Stopped ovnkube\\\\nI0309 02:43:53.141749 7279 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/
net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cddsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmfgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.054985 4901 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-image-registry/node-ca-c2pjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b166f443-7b7c-478b-bfc1-67291aa158fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f409cfbaaf5a97aa5b05433826d0ba084ecc5dc20afb3e733c9cb2e38a2e507a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4l68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\
"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-c2pjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.070667 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d957c835-1f04-423a-9641-444392da0360\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3836e17de731d173fb726c092d6f4a3ed70b2eab0297fb4e146004c22a3fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e716f348e3907e1861b6fd17ba499078ec41d0ee31bc8855d9744e4b0f640\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf2092a65f531dedb9529485252943746497cea4fec2b6c52ed5220eed868129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b72286bfa72acc7e3ad2d5eb5734f182e165eacaa4a7ccfd35675c43fc65a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.098558 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f58555f-103d-42bd-affa-68dd3ed5baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f78a3248e86d96e6293502db1727285fb76668c8242ed3399de4b8e8fb666a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79c92d776bf96579afc6fbc0e1e70bb645a371966be4e4e9d86e18ef3c977b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7936d7271945f5c0b5ab0ddecf9a3f46c9f7306b71e25fabbb3927303c6f9adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1098f7874be8a2a538c966845a946ab7993ec921e4e0859953f0187b4fabfd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23f66811373b03763a100551c0390732ccceac869c908d769de48f847e4ab82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b53a32afd9495980bb02a0904460a68c8a0f1a69febb98a74f030abde06f4765\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56c2088d5e47e527c26f2d31179205d6d5d3a79bab87bfad7849e9fa9a7d9886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://351df9957e5702cd2dd6907d91014c7adddfff064d77a7ef0c3d58b8934c3af4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.105946 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:54 crc kubenswrapper[4901]: E0309 02:43:54.106144 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.106499 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:54 crc kubenswrapper[4901]: E0309 02:43:54.106609 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.106890 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.106969 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:54 crc kubenswrapper[4901]: E0309 02:43:54.107188 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:54 crc kubenswrapper[4901]: E0309 02:43:54.107396 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.120877 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.141658 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ad2682-53b5-4e9d-acd1-0f0d210b322c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:28Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0309 02:42:28.169798 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 02:42:28.169917 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 02:42:28.170597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1507146702/tls.crt::/tmp/serving-cert-1507146702/tls.key\\\\\\\"\\\\nI0309 02:42:28.390239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 02:42:28.392919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 02:42:28.392936 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 02:42:28.392959 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 02:42:28.392964 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 02:42:28.400711 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 02:42:28.400732 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 02:42:28.400760 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 02:42:28.400778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 02:42:28.400786 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 02:42:28.400792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 02:42:28.400798 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 02:42:28.402404 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3406de49f1c67b014df6844f0f1b5e7960a
4d6c3cf62c63b04e0ffe60a7ac3bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:41:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.166016 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.181160 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lg26b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e883667-62d8-4920-a810-558a77f260ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hgs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lg26b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc 
kubenswrapper[4901]: I0309 02:43:54.197029 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tvqrz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ac7cbd-671c-4beb-8994-92502ee47ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee19f70b95aa359f8230d02f66a5ffe7bd83b01af33602ffee28239e335210d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbmmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tvqrz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.219481 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98293641-7bf9-4473-ae92-c80e56cefdb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8cb9651e7ee62e24bc299a24e24546fc393d36c247cfc7d6fab578f2d5fc392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9388d004278c87c5f70e21f80ffb92a37910be99d37d4890f1802ca8ee862c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57469350bdcfa9d44c83dd87ce9ef88a75c2ed0302fd633c6c23f95a60572f9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1536a2d9d7cc0d60c0442b625d3e58b3b5ffa615b5424405b8f896940e4085b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a7a4a4bf21360999d06acf702346f565fb76f5a5c4e5a8d1666fa395299d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce664be16b67bb0782620399124bae458654d66c9331521c2df213476651b740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99311d694d177551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99311d694d177
551b5f59aa756a0674f18816e477c3195cd30d40a805c964940\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T02:43:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm787\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jtcxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.235956 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d14fa511-1706-4ecd-9190-c39af889657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://799a782b20c6af25f16146a6f775dbb1f9a70567c6e183bc3760541beba535a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://994153537c9e7d0ac89a6651b22147c42d926
3d9652cc7748afa72e1e46a344a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5wgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rz59t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.256685 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad3db3b3-d72c-4e61-9db8-fcbc6abb0578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:41:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a5fc152f730178294cfb5d25a815b2041e4b712e4bdb7b334b4b417f463ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceafe61efea5730e474e74a9ed5a7558cc06dc75d105c37680ed1eea84b8827c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T02:42:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 02:41:48.034714 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 02:41:48.036179 1 observer_polling.go:159] Starting file observer\\\\nI0309 02:41:48.038474 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 02:41:48.039660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 02:42:14.296758 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 02:42:17.547772 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 02:42:17.547837 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:41:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c86d550b0e0522073aa52f3a6aa6671f7ded403e37eda7c7b91c6e069272dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04aae8495c8f5edddbe537c44cefdb20eaa781df485766f6e709a5d4c6e82df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:41:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.273119 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84cc042c2e8b6984f9b3357afce277a63722f17b3e8fd289eeeaa8a7a47935f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.290056 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65e722e8-52c4-4bb6-9927-f378b2f7296a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25b2c2479794812333e6425d16a7432c46578a2a6d240682761545e57af6b4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr22q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5c998\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.308641 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7a956f96042c9ac58c08b70e189b199ec512bc56ffedd9df3956127a2380a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.321683 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T02:42:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f7c21ba0d2836cb5905904941725dc9b5df5e173b47637722608075496850bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7ce0f53e8c88e047328fb896ffa5133e9ac723b73795634e24072acde25506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:42:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.334130 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-429fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T02:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c66d39cf5683043c4f736459fb89385f1480a81932fe2d1f1fd5dc7314f6f54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T02:43:48Z\\\",\\\"message\\\":\\\"2026-03-09T02:43:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ed7c822-a8d2-4694-b891-16aaa4b3a68b\\\\n2026-03-09T02:43:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ed7c822-a8d2-4694-b891-16aaa4b3a68b to /host/opt/cni/bin/\\\\n2026-03-09T02:43:03Z [verbose] multus-daemon started\\\\n2026-03-09T02:43:03Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T02:43:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T02:43:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T02:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4j6pq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T02:43:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-429fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T02:43:54Z is after 2025-08-24T17:21:41Z" Mar 09 02:43:54 crc kubenswrapper[4901]: I0309 02:43:54.992199 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/2.log" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.317023 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.317102 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.317126 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.317155 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.317178 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T02:43:55Z","lastTransitionTime":"2026-03-09T02:43:55Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.385500 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm"] Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.389980 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.396024 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.396128 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.396570 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.396763 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.413978 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24482c0c-a3e4-43d3-b08d-efa013d596ca-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.414632 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/24482c0c-a3e4-43d3-b08d-efa013d596ca-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.414802 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/24482c0c-a3e4-43d3-b08d-efa013d596ca-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.414848 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24482c0c-a3e4-43d3-b08d-efa013d596ca-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.414887 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/24482c0c-a3e4-43d3-b08d-efa013d596ca-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.439873 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.439848334 podStartE2EDuration="1m19.439848334s" podCreationTimestamp="2026-03-09 02:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:55.422387135 +0000 UTC m=+160.012050917" watchObservedRunningTime="2026-03-09 02:43:55.439848334 +0000 UTC m=+160.029512116" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.471154 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rz59t" podStartSLOduration=87.471125903 podStartE2EDuration="1m27.471125903s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:55.471116132 +0000 UTC m=+160.060779904" watchObservedRunningTime="2026-03-09 02:43:55.471125903 +0000 UTC m=+160.060789675" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.491370 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.491336659 podStartE2EDuration="1m19.491336659s" podCreationTimestamp="2026-03-09 02:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:55.490656342 +0000 UTC m=+160.080320114" watchObservedRunningTime="2026-03-09 02:43:55.491336659 +0000 UTC m=+160.081000431" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.515497 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24482c0c-a3e4-43d3-b08d-efa013d596ca-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.515559 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" 
(UniqueName: \"kubernetes.io/host-path/24482c0c-a3e4-43d3-b08d-efa013d596ca-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.515652 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24482c0c-a3e4-43d3-b08d-efa013d596ca-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.515711 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24482c0c-a3e4-43d3-b08d-efa013d596ca-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.515720 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/24482c0c-a3e4-43d3-b08d-efa013d596ca-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.515742 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/24482c0c-a3e4-43d3-b08d-efa013d596ca-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 
02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.515811 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/24482c0c-a3e4-43d3-b08d-efa013d596ca-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.516922 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24482c0c-a3e4-43d3-b08d-efa013d596ca-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.524973 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24482c0c-a3e4-43d3-b08d-efa013d596ca-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.541964 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tvqrz" podStartSLOduration=88.541944712 podStartE2EDuration="1m28.541944712s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:55.520539126 +0000 UTC m=+160.110202918" watchObservedRunningTime="2026-03-09 02:43:55.541944712 +0000 UTC m=+160.131608444" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.546315 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/24482c0c-a3e4-43d3-b08d-efa013d596ca-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cvmcm\" (UID: \"24482c0c-a3e4-43d3-b08d-efa013d596ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.562728 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jtcxx" podStartSLOduration=88.562708202 podStartE2EDuration="1m28.562708202s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:55.546629187 +0000 UTC m=+160.136292919" watchObservedRunningTime="2026-03-09 02:43:55.562708202 +0000 UTC m=+160.152371934" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.592959 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-429fk" podStartSLOduration=88.592937235 podStartE2EDuration="1m28.592937235s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:55.591837378 +0000 UTC m=+160.181501110" watchObservedRunningTime="2026-03-09 02:43:55.592937235 +0000 UTC m=+160.182600967" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.608436 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podStartSLOduration=88.608416035 podStartE2EDuration="1m28.608416035s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:55.608076177 +0000 UTC m=+160.197739939" watchObservedRunningTime="2026-03-09 02:43:55.608416035 +0000 UTC m=+160.198079787" 
Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.623134 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-c2pjf" podStartSLOduration=88.623117116 podStartE2EDuration="1m28.623117116s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:55.62288537 +0000 UTC m=+160.212549102" watchObservedRunningTime="2026-03-09 02:43:55.623117116 +0000 UTC m=+160.212780848" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.682759 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=73.682736441 podStartE2EDuration="1m13.682736441s" podCreationTimestamp="2026-03-09 02:42:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:55.682022913 +0000 UTC m=+160.271686645" watchObservedRunningTime="2026-03-09 02:43:55.682736441 +0000 UTC m=+160.272400173" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.683397 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=27.683391907 podStartE2EDuration="27.683391907s" podCreationTimestamp="2026-03-09 02:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:55.654276802 +0000 UTC m=+160.243940544" watchObservedRunningTime="2026-03-09 02:43:55.683391907 +0000 UTC m=+160.273055639" Mar 09 02:43:55 crc kubenswrapper[4901]: I0309 02:43:55.712214 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" Mar 09 02:43:56 crc kubenswrapper[4901]: I0309 02:43:56.001904 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" event={"ID":"24482c0c-a3e4-43d3-b08d-efa013d596ca","Type":"ContainerStarted","Data":"62860b21f853d5f893bb2cb08ac0c0df72952c0926ece7da83d7d576f84798f2"} Mar 09 02:43:56 crc kubenswrapper[4901]: I0309 02:43:56.001978 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" event={"ID":"24482c0c-a3e4-43d3-b08d-efa013d596ca","Type":"ContainerStarted","Data":"5aa942f42c3ecc4077e299f09c11a93b5e38fd8ce5dcc0073e0bafdc7932f333"} Mar 09 02:43:56 crc kubenswrapper[4901]: I0309 02:43:56.030714 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cvmcm" podStartSLOduration=89.030683678 podStartE2EDuration="1m29.030683678s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:43:56.029855957 +0000 UTC m=+160.619519719" watchObservedRunningTime="2026-03-09 02:43:56.030683678 +0000 UTC m=+160.620347450" Mar 09 02:43:56 crc kubenswrapper[4901]: I0309 02:43:56.112540 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:56 crc kubenswrapper[4901]: I0309 02:43:56.112540 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:56 crc kubenswrapper[4901]: I0309 02:43:56.112631 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:56 crc kubenswrapper[4901]: E0309 02:43:56.114479 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:56 crc kubenswrapper[4901]: I0309 02:43:56.114818 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:56 crc kubenswrapper[4901]: E0309 02:43:56.114931 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:56 crc kubenswrapper[4901]: E0309 02:43:56.115161 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:56 crc kubenswrapper[4901]: E0309 02:43:56.115408 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:56 crc kubenswrapper[4901]: I0309 02:43:56.135787 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 09 02:43:56 crc kubenswrapper[4901]: I0309 02:43:56.148903 4901 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 02:43:56 crc kubenswrapper[4901]: E0309 02:43:56.463009 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 02:43:58 crc kubenswrapper[4901]: I0309 02:43:58.105816 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:43:58 crc kubenswrapper[4901]: E0309 02:43:58.107145 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:43:58 crc kubenswrapper[4901]: I0309 02:43:58.107653 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:43:58 crc kubenswrapper[4901]: E0309 02:43:58.107885 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:43:58 crc kubenswrapper[4901]: I0309 02:43:58.108254 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:43:58 crc kubenswrapper[4901]: E0309 02:43:58.108500 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:43:58 crc kubenswrapper[4901]: I0309 02:43:58.108935 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:43:58 crc kubenswrapper[4901]: E0309 02:43:58.109208 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:43:59 crc kubenswrapper[4901]: I0309 02:43:59.126677 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 09 02:44:00 crc kubenswrapper[4901]: I0309 02:44:00.105910 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:44:00 crc kubenswrapper[4901]: I0309 02:44:00.105975 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:00 crc kubenswrapper[4901]: I0309 02:44:00.106094 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:44:00 crc kubenswrapper[4901]: E0309 02:44:00.106161 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:44:00 crc kubenswrapper[4901]: E0309 02:44:00.106339 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:44:00 crc kubenswrapper[4901]: E0309 02:44:00.106591 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:44:00 crc kubenswrapper[4901]: I0309 02:44:00.106600 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:00 crc kubenswrapper[4901]: E0309 02:44:00.106778 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:44:01 crc kubenswrapper[4901]: E0309 02:44:01.464448 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 02:44:02 crc kubenswrapper[4901]: I0309 02:44:02.105724 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:44:02 crc kubenswrapper[4901]: I0309 02:44:02.105754 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:02 crc kubenswrapper[4901]: I0309 02:44:02.105835 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:44:02 crc kubenswrapper[4901]: E0309 02:44:02.106057 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:44:02 crc kubenswrapper[4901]: I0309 02:44:02.106374 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:02 crc kubenswrapper[4901]: E0309 02:44:02.106494 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:44:02 crc kubenswrapper[4901]: E0309 02:44:02.106687 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:44:02 crc kubenswrapper[4901]: E0309 02:44:02.106870 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:44:04 crc kubenswrapper[4901]: I0309 02:44:04.105525 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:04 crc kubenswrapper[4901]: I0309 02:44:04.105582 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:04 crc kubenswrapper[4901]: E0309 02:44:04.105701 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:44:04 crc kubenswrapper[4901]: I0309 02:44:04.105772 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:44:04 crc kubenswrapper[4901]: E0309 02:44:04.105828 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:44:04 crc kubenswrapper[4901]: E0309 02:44:04.105956 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:44:04 crc kubenswrapper[4901]: I0309 02:44:04.106865 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:44:04 crc kubenswrapper[4901]: E0309 02:44:04.107253 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:44:06 crc kubenswrapper[4901]: I0309 02:44:06.105921 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:44:06 crc kubenswrapper[4901]: I0309 02:44:06.106013 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:06 crc kubenswrapper[4901]: E0309 02:44:06.108007 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:44:06 crc kubenswrapper[4901]: I0309 02:44:06.108138 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:44:06 crc kubenswrapper[4901]: I0309 02:44:06.108172 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:06 crc kubenswrapper[4901]: E0309 02:44:06.108741 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:44:06 crc kubenswrapper[4901]: E0309 02:44:06.108910 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:44:06 crc kubenswrapper[4901]: E0309 02:44:06.108358 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:44:06 crc kubenswrapper[4901]: I0309 02:44:06.125186 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.12516426 podStartE2EDuration="7.12516426s" podCreationTimestamp="2026-03-09 02:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:06.124539875 +0000 UTC m=+170.714203647" watchObservedRunningTime="2026-03-09 02:44:06.12516426 +0000 UTC m=+170.714828032" Mar 09 02:44:06 crc kubenswrapper[4901]: E0309 02:44:06.466146 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 02:44:08 crc kubenswrapper[4901]: I0309 02:44:08.106083 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:44:08 crc kubenswrapper[4901]: I0309 02:44:08.106324 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:08 crc kubenswrapper[4901]: I0309 02:44:08.106351 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:08 crc kubenswrapper[4901]: E0309 02:44:08.106565 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:44:08 crc kubenswrapper[4901]: I0309 02:44:08.106374 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:44:08 crc kubenswrapper[4901]: E0309 02:44:08.106663 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:44:08 crc kubenswrapper[4901]: E0309 02:44:08.106945 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:44:08 crc kubenswrapper[4901]: E0309 02:44:08.107099 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:44:08 crc kubenswrapper[4901]: I0309 02:44:08.108597 4901 scope.go:117] "RemoveContainer" containerID="42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e" Mar 09 02:44:08 crc kubenswrapper[4901]: E0309 02:44:08.108937 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bmfgc_openshift-ovn-kubernetes(40c17e04-3fc2-48a2-95dc-fe0428b91e66)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" Mar 09 02:44:10 crc kubenswrapper[4901]: I0309 02:44:10.106073 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:10 crc kubenswrapper[4901]: I0309 02:44:10.106073 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:44:10 crc kubenswrapper[4901]: I0309 02:44:10.106404 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:10 crc kubenswrapper[4901]: E0309 02:44:10.106442 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:44:10 crc kubenswrapper[4901]: I0309 02:44:10.106524 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:44:10 crc kubenswrapper[4901]: E0309 02:44:10.106669 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 02:44:10 crc kubenswrapper[4901]: E0309 02:44:10.106794 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca" Mar 09 02:44:10 crc kubenswrapper[4901]: E0309 02:44:10.106885 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:44:11 crc kubenswrapper[4901]: E0309 02:44:11.467619 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 02:44:12 crc kubenswrapper[4901]: I0309 02:44:12.106293 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:44:12 crc kubenswrapper[4901]: I0309 02:44:12.106316 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:44:12 crc kubenswrapper[4901]: I0309 02:44:12.106357 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:12 crc kubenswrapper[4901]: E0309 02:44:12.107028 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 02:44:12 crc kubenswrapper[4901]: I0309 02:44:12.106384 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:12 crc kubenswrapper[4901]: E0309 02:44:12.107141 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 02:44:12 crc kubenswrapper[4901]: E0309 02:44:12.107350 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 02:44:12 crc kubenswrapper[4901]: E0309 02:44:12.107578 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca"
Mar 09 02:44:14 crc kubenswrapper[4901]: I0309 02:44:14.105896 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 02:44:14 crc kubenswrapper[4901]: I0309 02:44:14.105933 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 02:44:14 crc kubenswrapper[4901]: I0309 02:44:14.106064 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 02:44:14 crc kubenswrapper[4901]: E0309 02:44:14.106066 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 02:44:14 crc kubenswrapper[4901]: I0309 02:44:14.106139 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b"
Mar 09 02:44:14 crc kubenswrapper[4901]: E0309 02:44:14.106440 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 02:44:14 crc kubenswrapper[4901]: E0309 02:44:14.106566 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 02:44:14 crc kubenswrapper[4901]: E0309 02:44:14.106792 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca"
Mar 09 02:44:16 crc kubenswrapper[4901]: I0309 02:44:16.106170 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b"
Mar 09 02:44:16 crc kubenswrapper[4901]: I0309 02:44:16.106245 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 02:44:16 crc kubenswrapper[4901]: I0309 02:44:16.106213 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 02:44:16 crc kubenswrapper[4901]: I0309 02:44:16.106170 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 02:44:16 crc kubenswrapper[4901]: E0309 02:44:16.108458 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 02:44:16 crc kubenswrapper[4901]: E0309 02:44:16.108580 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 02:44:16 crc kubenswrapper[4901]: E0309 02:44:16.108750 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca"
Mar 09 02:44:16 crc kubenswrapper[4901]: E0309 02:44:16.108889 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 02:44:16 crc kubenswrapper[4901]: E0309 02:44:16.469383 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 02:44:18 crc kubenswrapper[4901]: I0309 02:44:18.064805 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b"
Mar 09 02:44:18 crc kubenswrapper[4901]: E0309 02:44:18.065102 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 09 02:44:18 crc kubenswrapper[4901]: E0309 02:44:18.065274 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs podName:9e883667-62d8-4920-a810-558a77f260ca nodeName:}" failed. No retries permitted until 2026-03-09 02:45:22.065220618 +0000 UTC m=+246.654884340 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs") pod "network-metrics-daemon-lg26b" (UID: "9e883667-62d8-4920-a810-558a77f260ca") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 09 02:44:18 crc kubenswrapper[4901]: I0309 02:44:18.105935 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 02:44:18 crc kubenswrapper[4901]: E0309 02:44:18.106144 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 02:44:18 crc kubenswrapper[4901]: I0309 02:44:18.106444 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b"
Mar 09 02:44:18 crc kubenswrapper[4901]: E0309 02:44:18.106520 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca"
Mar 09 02:44:18 crc kubenswrapper[4901]: I0309 02:44:18.106582 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 02:44:18 crc kubenswrapper[4901]: E0309 02:44:18.106725 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 02:44:18 crc kubenswrapper[4901]: I0309 02:44:18.106857 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 02:44:18 crc kubenswrapper[4901]: E0309 02:44:18.107063 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 02:44:20 crc kubenswrapper[4901]: I0309 02:44:20.105977 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b"
Mar 09 02:44:20 crc kubenswrapper[4901]: E0309 02:44:20.106155 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca"
Mar 09 02:44:20 crc kubenswrapper[4901]: I0309 02:44:20.106493 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 02:44:20 crc kubenswrapper[4901]: E0309 02:44:20.106619 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 02:44:20 crc kubenswrapper[4901]: I0309 02:44:20.106888 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 02:44:20 crc kubenswrapper[4901]: E0309 02:44:20.107013 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 02:44:20 crc kubenswrapper[4901]: I0309 02:44:20.107938 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 02:44:20 crc kubenswrapper[4901]: I0309 02:44:20.108470 4901 scope.go:117] "RemoveContainer" containerID="42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"
Mar 09 02:44:20 crc kubenswrapper[4901]: E0309 02:44:20.108347 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 02:44:21 crc kubenswrapper[4901]: I0309 02:44:21.091960 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lg26b"]
Mar 09 02:44:21 crc kubenswrapper[4901]: I0309 02:44:21.112510 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/2.log"
Mar 09 02:44:21 crc kubenswrapper[4901]: I0309 02:44:21.120114 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b"
Mar 09 02:44:21 crc kubenswrapper[4901]: I0309 02:44:21.120355 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerStarted","Data":"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"}
Mar 09 02:44:21 crc kubenswrapper[4901]: E0309 02:44:21.120374 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca"
Mar 09 02:44:21 crc kubenswrapper[4901]: I0309 02:44:21.120923 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc"
Mar 09 02:44:21 crc kubenswrapper[4901]: I0309 02:44:21.162908 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podStartSLOduration=114.1628861 podStartE2EDuration="1m54.1628861s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:21.162768967 +0000 UTC m=+185.752432699" watchObservedRunningTime="2026-03-09 02:44:21.1628861 +0000 UTC m=+185.752549842"
Mar 09 02:44:21 crc kubenswrapper[4901]: E0309 02:44:21.470329 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 02:44:22 crc kubenswrapper[4901]: I0309 02:44:22.105767 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 02:44:22 crc kubenswrapper[4901]: I0309 02:44:22.105857 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 02:44:22 crc kubenswrapper[4901]: I0309 02:44:22.105928 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 02:44:22 crc kubenswrapper[4901]: E0309 02:44:22.106027 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 02:44:22 crc kubenswrapper[4901]: E0309 02:44:22.106250 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 02:44:22 crc kubenswrapper[4901]: E0309 02:44:22.106460 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 02:44:23 crc kubenswrapper[4901]: I0309 02:44:23.105717 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b"
Mar 09 02:44:23 crc kubenswrapper[4901]: E0309 02:44:23.105974 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca"
Mar 09 02:44:24 crc kubenswrapper[4901]: I0309 02:44:24.105973 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 02:44:24 crc kubenswrapper[4901]: I0309 02:44:24.106069 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 02:44:24 crc kubenswrapper[4901]: I0309 02:44:24.106291 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 02:44:24 crc kubenswrapper[4901]: E0309 02:44:24.106218 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 02:44:24 crc kubenswrapper[4901]: E0309 02:44:24.106497 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 02:44:24 crc kubenswrapper[4901]: E0309 02:44:24.106857 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 02:44:25 crc kubenswrapper[4901]: I0309 02:44:25.105754 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b"
Mar 09 02:44:25 crc kubenswrapper[4901]: E0309 02:44:25.105933 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lg26b" podUID="9e883667-62d8-4920-a810-558a77f260ca"
Mar 09 02:44:26 crc kubenswrapper[4901]: I0309 02:44:26.106117 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 02:44:26 crc kubenswrapper[4901]: I0309 02:44:26.106177 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 02:44:26 crc kubenswrapper[4901]: I0309 02:44:26.108444 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 02:44:26 crc kubenswrapper[4901]: E0309 02:44:26.108799 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 02:44:26 crc kubenswrapper[4901]: E0309 02:44:26.108965 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 02:44:26 crc kubenswrapper[4901]: E0309 02:44:26.109391 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 02:44:27 crc kubenswrapper[4901]: I0309 02:44:27.105400 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b"
Mar 09 02:44:27 crc kubenswrapper[4901]: I0309 02:44:27.109186 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 02:44:27 crc kubenswrapper[4901]: I0309 02:44:27.109610 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 02:44:28 crc kubenswrapper[4901]: I0309 02:44:28.106055 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 02:44:28 crc kubenswrapper[4901]: I0309 02:44:28.106186 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 02:44:28 crc kubenswrapper[4901]: I0309 02:44:28.106101 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 02:44:28 crc kubenswrapper[4901]: I0309 02:44:28.109637 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 09 02:44:28 crc kubenswrapper[4901]: I0309 02:44:28.110490 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 09 02:44:28 crc kubenswrapper[4901]: I0309 02:44:28.110800 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 09 02:44:28 crc kubenswrapper[4901]: I0309 02:44:28.110977 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 09 02:44:31 crc kubenswrapper[4901]: I0309 02:44:31.232888 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.242246 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.298041 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-92tt5"]
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.299043 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-92tt5"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.302084 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.302401 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.302547 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.302725 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.302888 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.303020 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.303169 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.305743 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.306491 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.306591 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.308291 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jgbr"]
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.318263 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f6c5c"]
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.318917 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs"]
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.319570 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf"]
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.320076 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.320860 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.321976 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.333546 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c"
Mar 09 02:44:36 crc kubenswrapper[4901]: W0309 02:44:36.333780 4901 reflector.go:561] object-"openshift-route-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Mar 09 02:44:36 crc kubenswrapper[4901]: E0309 02:44:36.334054 4901 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.334075 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 02:44:36 crc kubenswrapper[4901]: W0309 02:44:36.334605 4901 reflector.go:561] object-"openshift-route-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Mar 09 02:44:36 crc kubenswrapper[4901]: E0309 02:44:36.334647 4901 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.335936 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5"]
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.336674 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.349509 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl"]
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.350149 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.350168 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.376791 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.377548 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.377877 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.378000 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.378159 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.378652 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tzf54"]
Mar 09 02:44:36 crc kubenswrapper[4901]: W0309 02:44:36.378807 4901 reflector.go:561] object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Mar 09 02:44:36 crc kubenswrapper[4901]: E0309 02:44:36.378839 4901 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 02:44:36 crc kubenswrapper[4901]: W0309 02:44:36.378894 4901 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Mar 09 02:44:36 crc kubenswrapper[4901]: E0309 02:44:36.378907 4901 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.378952 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.379092 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.379269 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.379385 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.379692 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.379803 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.379927 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.380302 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.380429 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.380527 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 09 02:44:36 crc kubenswrapper[4901]: W0309 02:44:36.380603 4901 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.380611 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 09 02:44:36 crc kubenswrapper[4901]: E0309 02:44:36.380621 4901 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.380706 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.380849 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.380884 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.380985 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.394558 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.380852 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.397669 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.398663 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.399198 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.400271 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.400463 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 
09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.400635 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.400775 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.400952 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.401301 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.402058 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.402597 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.429799 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vwwl2"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.429983 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e87e0a88-a606-4f22-be47-72cc718fce1b-audit-dir\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430010 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e1f8e22e-89a6-46b7-94f6-65aa27575c48-serving-cert\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430047 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-audit\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430068 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-node-pullsecrets\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430081 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e87e0a88-a606-4f22-be47-72cc718fce1b-serving-cert\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430100 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430117 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-audit-dir\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430131 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-config\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430145 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz995\" (UniqueName: \"kubernetes.io/projected/e1f8e22e-89a6-46b7-94f6-65aa27575c48-kube-api-access-rz995\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430164 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-etcd-client\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430177 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8919ddf1-096f-407f-8ea3-91a26a623f43-images\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430192 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-policies\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430207 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-etcd-serving-ca\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430243 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8919ddf1-096f-407f-8ea3-91a26a623f43-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430266 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430281 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-serving-cert\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430296 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-dir\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430311 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2m5m\" (UniqueName: \"kubernetes.io/projected/8919ddf1-096f-407f-8ea3-91a26a623f43-kube-api-access-p2m5m\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430318 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430328 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e87e0a88-a606-4f22-be47-72cc718fce1b-etcd-client\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430346 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430363 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430384 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vrmc\" (UniqueName: \"kubernetes.io/projected/e87e0a88-a606-4f22-be47-72cc718fce1b-kube-api-access-8vrmc\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430401 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0382652-0457-4ef8-a28a-ac1a9c2de73d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6t2h5\" (UID: \"c0382652-0457-4ef8-a28a-ac1a9c2de73d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430437 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430456 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430483 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430505 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-image-import-ca\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430518 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-encryption-config\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430532 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e87e0a88-a606-4f22-be47-72cc718fce1b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430545 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-config\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430570 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8919ddf1-096f-407f-8ea3-91a26a623f43-config\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430585 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7n5q\" (UniqueName: \"kubernetes.io/projected/c0382652-0457-4ef8-a28a-ac1a9c2de73d-kube-api-access-z7n5q\") pod \"openshift-apiserver-operator-796bbdcf4f-6t2h5\" (UID: \"c0382652-0457-4ef8-a28a-ac1a9c2de73d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430584 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f6c5c"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.430602 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e87e0a88-a606-4f22-be47-72cc718fce1b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.431475 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jgbr"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.431999 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e87e0a88-a606-4f22-be47-72cc718fce1b-encryption-config\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.432028 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.432100 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0382652-0457-4ef8-a28a-ac1a9c2de73d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6t2h5\" (UID: \"c0382652-0457-4ef8-a28a-ac1a9c2de73d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.432198 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wtzq\" (UniqueName: \"kubernetes.io/projected/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-kube-api-access-6wtzq\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.432235 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.432273 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-client-ca\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.432295 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.432313 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e87e0a88-a606-4f22-be47-72cc718fce1b-audit-policies\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.432347 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-trusted-ca-bundle\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.432730 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-92tt5"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.437362 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.437933 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.458852 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.458938 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.459305 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.459422 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.459454 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.459588 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.459621 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.459665 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.459728 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.459923 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.459975 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.460111 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.460213 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.460856 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.460943 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zfqpl"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.461305 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.461402 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.461485 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.461766 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.461897 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 02:44:36 crc 
kubenswrapper[4901]: I0309 02:44:36.461936 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.462004 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.462032 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.462151 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zfqpl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.462331 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.464464 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.464503 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.464865 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p7chk"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.465262 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.465504 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.467323 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jznjb"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.467741 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jznjb" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.467831 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stq8d"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.468556 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.469207 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.469658 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.473329 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vwwl2"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.483270 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.484373 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.484495 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.484598 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.484702 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.484970 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.485069 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.485175 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.485337 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 
09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.487887 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.487989 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.491299 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tpmfc"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.491706 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.491938 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-np7wv"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.494363 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-np7wv" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.494645 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.494866 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.504290 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.504518 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.504753 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.505033 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.505203 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.505473 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.505682 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.505805 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.505902 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.506019 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 
02:44:36.518127 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.519196 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.519957 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.519568 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.521107 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.522027 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.525914 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.538948 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.539108 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.540163 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542369 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-config\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542398 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64d1cf60-f9b7-4722-83aa-7967cd9827d6-auth-proxy-config\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542425 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542444 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542463 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vrmc\" (UniqueName: \"kubernetes.io/projected/e87e0a88-a606-4f22-be47-72cc718fce1b-kube-api-access-8vrmc\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542483 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0382652-0457-4ef8-a28a-ac1a9c2de73d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6t2h5\" (UID: \"c0382652-0457-4ef8-a28a-ac1a9c2de73d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542519 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542562 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx2wh\" (UniqueName: \"kubernetes.io/projected/64d1cf60-f9b7-4722-83aa-7967cd9827d6-kube-api-access-jx2wh\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542625 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542646 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 
02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542668 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542685 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d1cf60-f9b7-4722-83aa-7967cd9827d6-config\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.542939 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e87e0a88-a606-4f22-be47-72cc718fce1b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.543014 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-image-import-ca\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.543033 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-encryption-config\") pod \"apiserver-76f77b778f-92tt5\" (UID: 
\"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.543051 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-config\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.543110 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e87e0a88-a606-4f22-be47-72cc718fce1b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.543301 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.543812 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-config\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.543897 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e87e0a88-a606-4f22-be47-72cc718fce1b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.544447 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-image-import-ca\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.544601 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e87e0a88-a606-4f22-be47-72cc718fce1b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.544660 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8919ddf1-096f-407f-8ea3-91a26a623f43-config\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.544685 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7n5q\" (UniqueName: \"kubernetes.io/projected/c0382652-0457-4ef8-a28a-ac1a9c2de73d-kube-api-access-z7n5q\") pod \"openshift-apiserver-operator-796bbdcf4f-6t2h5\" (UID: \"c0382652-0457-4ef8-a28a-ac1a9c2de73d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.544708 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc 
kubenswrapper[4901]: I0309 02:44:36.544855 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e87e0a88-a606-4f22-be47-72cc718fce1b-encryption-config\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.545313 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0382652-0457-4ef8-a28a-ac1a9c2de73d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6t2h5\" (UID: \"c0382652-0457-4ef8-a28a-ac1a9c2de73d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.545359 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.545462 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9v4g4"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.545783 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.545939 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.546001 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.546006 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wtzq\" (UniqueName: \"kubernetes.io/projected/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-kube-api-access-6wtzq\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.546060 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-client-ca\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.546120 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.546174 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-trusted-ca-bundle\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.546203 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e87e0a88-a606-4f22-be47-72cc718fce1b-audit-policies\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.546258 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/64d1cf60-f9b7-4722-83aa-7967cd9827d6-machine-approver-tls\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.546409 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e87e0a88-a606-4f22-be47-72cc718fce1b-audit-dir\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.547427 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8919ddf1-096f-407f-8ea3-91a26a623f43-config\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.547646 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c0382652-0457-4ef8-a28a-ac1a9c2de73d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6t2h5\" (UID: \"c0382652-0457-4ef8-a28a-ac1a9c2de73d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.548171 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.548336 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e87e0a88-a606-4f22-be47-72cc718fce1b-audit-dir\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.548371 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f8e22e-89a6-46b7-94f6-65aa27575c48-serving-cert\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.548808 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.548936 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e87e0a88-a606-4f22-be47-72cc718fce1b-audit-policies\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: 
\"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.549260 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-trusted-ca-bundle\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.549583 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.549747 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550173 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550467 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-audit\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550591 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-node-pullsecrets\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550621 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e87e0a88-a606-4f22-be47-72cc718fce1b-serving-cert\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550648 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550671 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-audit-dir\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550694 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-config\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550699 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-audit\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550720 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rz995\" (UniqueName: \"kubernetes.io/projected/e1f8e22e-89a6-46b7-94f6-65aa27575c48-kube-api-access-rz995\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550749 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgfr4\" (UniqueName: \"kubernetes.io/projected/0bef1913-8737-48c8-bcf5-89daf1bd1c54-kube-api-access-hgfr4\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550773 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-policies\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550796 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-etcd-client\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550823 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8919ddf1-096f-407f-8ea3-91a26a623f43-images\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550844 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-etcd-serving-ca\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550866 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8919ddf1-096f-407f-8ea3-91a26a623f43-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550875 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-node-pullsecrets\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550895 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550918 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwtv4\" (UniqueName: \"kubernetes.io/projected/890abe57-aa9b-4c46-8a26-c2c1fd724fab-kube-api-access-xwtv4\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550937 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-client-ca\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550940 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-serving-cert\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.550987 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-dir\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.551010 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-client-ca\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.551028 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e87e0a88-a606-4f22-be47-72cc718fce1b-etcd-client\") pod 
\"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.551050 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2m5m\" (UniqueName: \"kubernetes.io/projected/8919ddf1-096f-407f-8ea3-91a26a623f43-kube-api-access-p2m5m\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.551070 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/890abe57-aa9b-4c46-8a26-c2c1fd724fab-serving-cert\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.551091 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.551516 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.551848 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f8e22e-89a6-46b7-94f6-65aa27575c48-serving-cert\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.551911 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.551995 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-dir\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.552950 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.553028 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.553471 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-audit-dir\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.554282 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8919ddf1-096f-407f-8ea3-91a26a623f43-images\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.554664 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-config\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.555963 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.557170 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-etcd-serving-ca\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.557636 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-policies\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.558291 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.558849 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.558860 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.561040 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.561305 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.561401 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e87e0a88-a606-4f22-be47-72cc718fce1b-serving-cert\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.561700 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.562704 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.563699 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e87e0a88-a606-4f22-be47-72cc718fce1b-encryption-config\") pod \"apiserver-7bbb656c7d-xs7zs\" 
(UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.564046 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-encryption-config\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.564246 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0382652-0457-4ef8-a28a-ac1a9c2de73d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6t2h5\" (UID: \"c0382652-0457-4ef8-a28a-ac1a9c2de73d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.564654 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.566285 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-etcd-client\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.566973 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc 
kubenswrapper[4901]: I0309 02:44:36.567027 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.567441 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.567533 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8919ddf1-096f-407f-8ea3-91a26a623f43-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.567776 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.568502 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tzf54"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.568792 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-serving-cert\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " 
pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.569622 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e87e0a88-a606-4f22-be47-72cc718fce1b-etcd-client\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.572119 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.573083 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.573535 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.574054 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.574986 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.575470 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.576487 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.577469 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.578772 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.579915 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.580804 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.581176 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.581347 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.581877 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.582639 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.588397 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.594747 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.596548 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.597275 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.598796 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.599926 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kwffm"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.600449 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.602510 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.602703 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.605431 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m52jv"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.606642 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.607714 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m8h87"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.608578 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.608762 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v478l"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.609837 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.609959 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-v478l" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.610675 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.610836 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550404-w8858"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.611591 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550404-w8858" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.612043 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.613391 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lgxp2"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.614449 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.614598 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-xc8gr"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.615182 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.615696 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p7chk"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.616617 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.617702 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.617947 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-np7wv"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.619314 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jznjb"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.620435 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tpmfc"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.621813 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.622149 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.623037 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zfqpl"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.625506 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stq8d"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.626679 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.627715 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.628768 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8"] Mar 09 02:44:36 crc 
kubenswrapper[4901]: I0309 02:44:36.629832 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.631162 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.631925 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-x28d9"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.632889 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x28d9" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.633281 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kwffm"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.634301 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m8h87"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.635409 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.637971 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550404-w8858"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.641164 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.641502 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.642724 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.644242 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.645432 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.646734 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.648088 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.649310 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.650933 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xc8gr"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651552 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dj9g\" (UniqueName: \"kubernetes.io/projected/9800e019-e239-4af9-9059-0c29be7ca479-kube-api-access-7dj9g\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651582 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sc9g\" (UniqueName: \"kubernetes.io/projected/c8f28c79-5540-44c0-acea-aa36ce8a47d9-kube-api-access-7sc9g\") 
pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651607 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ac07da-c623-40dc-b43b-87d5b43c4468-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651630 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwtv4\" (UniqueName: \"kubernetes.io/projected/890abe57-aa9b-4c46-8a26-c2c1fd724fab-kube-api-access-xwtv4\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651650 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-client-ca\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651672 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nrgmz\" (UID: \"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:36 crc 
kubenswrapper[4901]: I0309 02:44:36.651694 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c8f28c79-5540-44c0-acea-aa36ce8a47d9-metrics-tls\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651721 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/890abe57-aa9b-4c46-8a26-c2c1fd724fab-serving-cert\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651742 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99c51354-0ea1-4e7a-bd34-1bb57e6422c0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jffft\" (UID: \"99c51354-0ea1-4e7a-bd34-1bb57e6422c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651760 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651780 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-config\") pod 
\"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651798 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64d1cf60-f9b7-4722-83aa-7967cd9827d6-auth-proxy-config\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651825 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8f28c79-5540-44c0-acea-aa36ce8a47d9-trusted-ca\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651842 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1336f974-5df8-4d81-91bb-2c2364c87479-config\") pod \"kube-apiserver-operator-766d6c64bb-xtpw8\" (UID: \"1336f974-5df8-4d81-91bb-2c2364c87479\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651864 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx2wh\" (UniqueName: \"kubernetes.io/projected/64d1cf60-f9b7-4722-83aa-7967cd9827d6-kube-api-access-jx2wh\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651883 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7d59\" (UniqueName: \"kubernetes.io/projected/e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0-kube-api-access-k7d59\") pod \"openshift-config-operator-7777fb866f-nrgmz\" (UID: \"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651917 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ac07da-c623-40dc-b43b-87d5b43c4468-serving-cert\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651937 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d1cf60-f9b7-4722-83aa-7967cd9827d6-config\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651958 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0-serving-cert\") pod \"openshift-config-operator-7777fb866f-nrgmz\" (UID: \"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651979 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1336f974-5df8-4d81-91bb-2c2364c87479-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-xtpw8\" (UID: \"1336f974-5df8-4d81-91bb-2c2364c87479\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.651995 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1336f974-5df8-4d81-91bb-2c2364c87479-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xtpw8\" (UID: \"1336f974-5df8-4d81-91bb-2c2364c87479\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652030 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652049 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ac07da-c623-40dc-b43b-87d5b43c4468-config\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652066 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9800e019-e239-4af9-9059-0c29be7ca479-etcd-client\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 
02:44:36.652084 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkfnm\" (UniqueName: \"kubernetes.io/projected/054fc716-7800-43b0-af23-328b685f89f9-kube-api-access-wkfnm\") pod \"downloads-7954f5f757-jznjb\" (UID: \"054fc716-7800-43b0-af23-328b685f89f9\") " pod="openshift-console/downloads-7954f5f757-jznjb" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652102 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p85mn\" (UniqueName: \"kubernetes.io/projected/57ac07da-c623-40dc-b43b-87d5b43c4468-kube-api-access-p85mn\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652128 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8f28c79-5540-44c0-acea-aa36ce8a47d9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652148 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/64d1cf60-f9b7-4722-83aa-7967cd9827d6-machine-approver-tls\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652183 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6jbh\" (UniqueName: 
\"kubernetes.io/projected/99c51354-0ea1-4e7a-bd34-1bb57e6422c0-kube-api-access-x6jbh\") pod \"cluster-samples-operator-665b6dd947-jffft\" (UID: \"99c51354-0ea1-4e7a-bd34-1bb57e6422c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652201 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9800e019-e239-4af9-9059-0c29be7ca479-etcd-ca\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652264 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgfr4\" (UniqueName: \"kubernetes.io/projected/0bef1913-8737-48c8-bcf5-89daf1bd1c54-kube-api-access-hgfr4\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652285 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9800e019-e239-4af9-9059-0c29be7ca479-config\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652302 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9800e019-e239-4af9-9059-0c29be7ca479-etcd-service-ca\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 
02:44:36.652318 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ac07da-c623-40dc-b43b-87d5b43c4468-service-ca-bundle\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652337 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9800e019-e239-4af9-9059-0c29be7ca479-serving-cert\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.652615 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64d1cf60-f9b7-4722-83aa-7967cd9827d6-auth-proxy-config\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.653010 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lgxp2"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.653020 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d1cf60-f9b7-4722-83aa-7967cd9827d6-config\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.653269 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.655337 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/64d1cf60-f9b7-4722-83aa-7967cd9827d6-machine-approver-tls\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.659278 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p949f"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.660974 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v478l"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.661111 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p949f" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.663807 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.664111 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p949f"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.665481 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.669472 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.670601 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m52jv"] Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.683536 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.702082 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.722025 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.743079 4901 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754057 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8f28c79-5540-44c0-acea-aa36ce8a47d9-trusted-ca\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754145 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1336f974-5df8-4d81-91bb-2c2364c87479-config\") pod \"kube-apiserver-operator-766d6c64bb-xtpw8\" (UID: \"1336f974-5df8-4d81-91bb-2c2364c87479\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754216 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7d59\" (UniqueName: \"kubernetes.io/projected/e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0-kube-api-access-k7d59\") pod \"openshift-config-operator-7777fb866f-nrgmz\" (UID: \"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754303 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ac07da-c623-40dc-b43b-87d5b43c4468-serving-cert\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754352 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0-serving-cert\") pod \"openshift-config-operator-7777fb866f-nrgmz\" (UID: \"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754386 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1336f974-5df8-4d81-91bb-2c2364c87479-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xtpw8\" (UID: \"1336f974-5df8-4d81-91bb-2c2364c87479\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754419 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1336f974-5df8-4d81-91bb-2c2364c87479-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xtpw8\" (UID: \"1336f974-5df8-4d81-91bb-2c2364c87479\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754485 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ac07da-c623-40dc-b43b-87d5b43c4468-config\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754529 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkfnm\" (UniqueName: \"kubernetes.io/projected/054fc716-7800-43b0-af23-328b685f89f9-kube-api-access-wkfnm\") pod \"downloads-7954f5f757-jznjb\" (UID: \"054fc716-7800-43b0-af23-328b685f89f9\") " pod="openshift-console/downloads-7954f5f757-jznjb" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 
02:44:36.754561 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9800e019-e239-4af9-9059-0c29be7ca479-etcd-client\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754600 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p85mn\" (UniqueName: \"kubernetes.io/projected/57ac07da-c623-40dc-b43b-87d5b43c4468-kube-api-access-p85mn\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754648 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8f28c79-5540-44c0-acea-aa36ce8a47d9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754716 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9800e019-e239-4af9-9059-0c29be7ca479-etcd-ca\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754750 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6jbh\" (UniqueName: \"kubernetes.io/projected/99c51354-0ea1-4e7a-bd34-1bb57e6422c0-kube-api-access-x6jbh\") pod \"cluster-samples-operator-665b6dd947-jffft\" (UID: \"99c51354-0ea1-4e7a-bd34-1bb57e6422c0\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754817 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9800e019-e239-4af9-9059-0c29be7ca479-config\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754849 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9800e019-e239-4af9-9059-0c29be7ca479-etcd-service-ca\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754895 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9800e019-e239-4af9-9059-0c29be7ca479-serving-cert\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754926 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ac07da-c623-40dc-b43b-87d5b43c4468-service-ca-bundle\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.754966 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dj9g\" (UniqueName: \"kubernetes.io/projected/9800e019-e239-4af9-9059-0c29be7ca479-kube-api-access-7dj9g\") pod 
\"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.755000 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sc9g\" (UniqueName: \"kubernetes.io/projected/c8f28c79-5540-44c0-acea-aa36ce8a47d9-kube-api-access-7sc9g\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.755045 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ac07da-c623-40dc-b43b-87d5b43c4468-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.755103 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nrgmz\" (UID: \"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.755136 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c8f28c79-5540-44c0-acea-aa36ce8a47d9-metrics-tls\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.755189 4901 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99c51354-0ea1-4e7a-bd34-1bb57e6422c0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jffft\" (UID: \"99c51354-0ea1-4e7a-bd34-1bb57e6422c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.755555 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ac07da-c623-40dc-b43b-87d5b43c4468-config\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.755729 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1336f974-5df8-4d81-91bb-2c2364c87479-config\") pod \"kube-apiserver-operator-766d6c64bb-xtpw8\" (UID: \"1336f974-5df8-4d81-91bb-2c2364c87479\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.755873 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9800e019-e239-4af9-9059-0c29be7ca479-etcd-service-ca\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.756070 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nrgmz\" (UID: \"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 
02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.756501 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9800e019-e239-4af9-9059-0c29be7ca479-config\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.756735 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8f28c79-5540-44c0-acea-aa36ce8a47d9-trusted-ca\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.756763 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9800e019-e239-4af9-9059-0c29be7ca479-etcd-ca\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.756856 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ac07da-c623-40dc-b43b-87d5b43c4468-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.757679 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ac07da-c623-40dc-b43b-87d5b43c4468-service-ca-bundle\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.759523 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99c51354-0ea1-4e7a-bd34-1bb57e6422c0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jffft\" (UID: \"99c51354-0ea1-4e7a-bd34-1bb57e6422c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.760087 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9800e019-e239-4af9-9059-0c29be7ca479-serving-cert\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.760344 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0-serving-cert\") pod \"openshift-config-operator-7777fb866f-nrgmz\" (UID: \"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.761060 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ac07da-c623-40dc-b43b-87d5b43c4468-serving-cert\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.761513 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9800e019-e239-4af9-9059-0c29be7ca479-etcd-client\") pod 
\"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.762875 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1336f974-5df8-4d81-91bb-2c2364c87479-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xtpw8\" (UID: \"1336f974-5df8-4d81-91bb-2c2364c87479\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.763539 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c8f28c79-5540-44c0-acea-aa36ce8a47d9-metrics-tls\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.787046 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vrmc\" (UniqueName: \"kubernetes.io/projected/e87e0a88-a606-4f22-be47-72cc718fce1b-kube-api-access-8vrmc\") pod \"apiserver-7bbb656c7d-xs7zs\" (UID: \"e87e0a88-a606-4f22-be47-72cc718fce1b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.801520 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.811452 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7n5q\" (UniqueName: \"kubernetes.io/projected/c0382652-0457-4ef8-a28a-ac1a9c2de73d-kube-api-access-z7n5q\") pod \"openshift-apiserver-operator-796bbdcf4f-6t2h5\" (UID: \"c0382652-0457-4ef8-a28a-ac1a9c2de73d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" Mar 09 
02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.821981 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.841681 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.862558 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.881911 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.922496 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.930507 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wtzq\" (UniqueName: \"kubernetes.io/projected/bffc6cea-5224-4f60-b2c2-3e93aa5cff97-kube-api-access-6wtzq\") pod \"apiserver-76f77b778f-92tt5\" (UID: \"bffc6cea-5224-4f60-b2c2-3e93aa5cff97\") " pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.942623 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.943412 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.961908 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.982451 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 02:44:36 crc kubenswrapper[4901]: I0309 02:44:36.996526 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.042956 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.054007 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.089844 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2m5m\" (UniqueName: \"kubernetes.io/projected/8919ddf1-096f-407f-8ea3-91a26a623f43-kube-api-access-p2m5m\") pod \"machine-api-operator-5694c8668f-f6c5c\" (UID: \"8919ddf1-096f-407f-8ea3-91a26a623f43\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.101569 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.108337 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz995\" (UniqueName: \"kubernetes.io/projected/e1f8e22e-89a6-46b7-94f6-65aa27575c48-kube-api-access-rz995\") pod \"controller-manager-879f6c89f-5jgbr\" (UID: 
\"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.121816 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.145508 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.163450 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.181998 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.201696 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.217043 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-92tt5"] Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.221561 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 02:44:37 crc kubenswrapper[4901]: W0309 02:44:37.227667 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbffc6cea_5224_4f60_b2c2_3e93aa5cff97.slice/crio-b9167486f9c8c7a0050c9f9e894b45e1cd17a07e4b4dfac55dd3fd78d11861f9 WatchSource:0}: Error finding container 
b9167486f9c8c7a0050c9f9e894b45e1cd17a07e4b4dfac55dd3fd78d11861f9: Status 404 returned error can't find the container with id b9167486f9c8c7a0050c9f9e894b45e1cd17a07e4b4dfac55dd3fd78d11861f9 Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.244706 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.254479 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs"] Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.261728 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.276420 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.284793 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.294702 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5"] Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.302550 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.303646 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" Mar 09 02:44:37 crc kubenswrapper[4901]: W0309 02:44:37.314276 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0382652_0457_4ef8_a28a_ac1a9c2de73d.slice/crio-0c87f5c5a7732e8f8d84efcf756e11b0a03e2f489221a81b5be4e81334e1633c WatchSource:0}: Error finding container 0c87f5c5a7732e8f8d84efcf756e11b0a03e2f489221a81b5be4e81334e1633c: Status 404 returned error can't find the container with id 0c87f5c5a7732e8f8d84efcf756e11b0a03e2f489221a81b5be4e81334e1633c Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.321950 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.342280 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.374838 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.382641 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.402789 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.422567 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.443811 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.471085 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.477557 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jgbr"] Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.481977 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.501424 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.512218 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f6c5c"] Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.521909 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.541515 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.562634 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.581084 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.600169 4901 request.go:700] Waited for 1.001006652s due to 
client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-operator-dockercfg-98p87&limit=500&resourceVersion=0 Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.601347 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 02:44:37 crc kubenswrapper[4901]: W0309 02:44:37.612648 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8919ddf1_096f_407f_8ea3_91a26a623f43.slice/crio-1ae58b3b7f00ddb9fdbf9ef93ea06305b3a06b6567d7573db71bc7ea71607476 WatchSource:0}: Error finding container 1ae58b3b7f00ddb9fdbf9ef93ea06305b3a06b6567d7573db71bc7ea71607476: Status 404 returned error can't find the container with id 1ae58b3b7f00ddb9fdbf9ef93ea06305b3a06b6567d7573db71bc7ea71607476 Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.621247 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.641518 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 02:44:37 crc kubenswrapper[4901]: E0309 02:44:37.652315 4901 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 09 02:44:37 crc kubenswrapper[4901]: E0309 02:44:37.652343 4901 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 02:44:37 crc kubenswrapper[4901]: E0309 02:44:37.652398 4901 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync 
configmap cache: timed out waiting for the condition Mar 09 02:44:37 crc kubenswrapper[4901]: E0309 02:44:37.652757 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-config podName:890abe57-aa9b-4c46-8a26-c2c1fd724fab nodeName:}" failed. No retries permitted until 2026-03-09 02:44:38.152567841 +0000 UTC m=+202.742231593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-config") pod "route-controller-manager-6576b87f9c-57xhf" (UID: "890abe57-aa9b-4c46-8a26-c2c1fd724fab") : failed to sync configmap cache: timed out waiting for the condition Mar 09 02:44:37 crc kubenswrapper[4901]: E0309 02:44:37.652899 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/890abe57-aa9b-4c46-8a26-c2c1fd724fab-serving-cert podName:890abe57-aa9b-4c46-8a26-c2c1fd724fab nodeName:}" failed. No retries permitted until 2026-03-09 02:44:38.152880501 +0000 UTC m=+202.742544283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/890abe57-aa9b-4c46-8a26-c2c1fd724fab-serving-cert") pod "route-controller-manager-6576b87f9c-57xhf" (UID: "890abe57-aa9b-4c46-8a26-c2c1fd724fab") : failed to sync secret cache: timed out waiting for the condition Mar 09 02:44:37 crc kubenswrapper[4901]: E0309 02:44:37.653024 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-client-ca podName:890abe57-aa9b-4c46-8a26-c2c1fd724fab nodeName:}" failed. No retries permitted until 2026-03-09 02:44:38.153006345 +0000 UTC m=+202.742670097 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-client-ca") pod "route-controller-manager-6576b87f9c-57xhf" (UID: "890abe57-aa9b-4c46-8a26-c2c1fd724fab") : failed to sync configmap cache: timed out waiting for the condition Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.662366 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.682392 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.706930 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.740078 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.743307 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.762983 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.784532 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.802183 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.822310 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 02:44:37 crc 
kubenswrapper[4901]: I0309 02:44:37.841870 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.862351 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.881466 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.902055 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.921519 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.942341 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.962029 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 02:44:37 crc kubenswrapper[4901]: I0309 02:44:37.981582 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.002395 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.020994 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.041927 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 02:44:38 crc 
kubenswrapper[4901]: I0309 02:44:38.098137 4901 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.098275 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.101363 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.122272 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.142665 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.162923 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.186251 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" event={"ID":"8919ddf1-096f-407f-8ea3-91a26a623f43","Type":"ContainerStarted","Data":"78c538d5d4ce9f601edbb37e9b71f22f20411472bf5d899f4f7ba48cf975f0fe"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.186311 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" event={"ID":"8919ddf1-096f-407f-8ea3-91a26a623f43","Type":"ContainerStarted","Data":"195010185b621dfd1c0c30a4f7793b26083b0ae104f05296f74c1fad0f259a7f"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.186336 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" 
event={"ID":"8919ddf1-096f-407f-8ea3-91a26a623f43","Type":"ContainerStarted","Data":"1ae58b3b7f00ddb9fdbf9ef93ea06305b3a06b6567d7573db71bc7ea71607476"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.186953 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.189254 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" event={"ID":"e1f8e22e-89a6-46b7-94f6-65aa27575c48","Type":"ContainerStarted","Data":"763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.189319 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" event={"ID":"e1f8e22e-89a6-46b7-94f6-65aa27575c48","Type":"ContainerStarted","Data":"906d980e15aacd868d5461777efa76c684fc94240e37f6c98e5dd557d2670978"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.189820 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.191543 4901 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5jgbr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.191635 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" podUID="e1f8e22e-89a6-46b7-94f6-65aa27575c48" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 09 02:44:38 crc 
kubenswrapper[4901]: I0309 02:44:38.191706 4901 generic.go:334] "Generic (PLEG): container finished" podID="e87e0a88-a606-4f22-be47-72cc718fce1b" containerID="8be478cde9b5e3cd930875412b5b76741ca01179af048ad337e6da05aa023840" exitCode=0 Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.191791 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" event={"ID":"e87e0a88-a606-4f22-be47-72cc718fce1b","Type":"ContainerDied","Data":"8be478cde9b5e3cd930875412b5b76741ca01179af048ad337e6da05aa023840"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.191822 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" event={"ID":"e87e0a88-a606-4f22-be47-72cc718fce1b","Type":"ContainerStarted","Data":"7a874cb493dfd29d9da1f5365a91ca91245b83f0058f540c200ab41c7605b378"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.193746 4901 generic.go:334] "Generic (PLEG): container finished" podID="bffc6cea-5224-4f60-b2c2-3e93aa5cff97" containerID="6288384aa2e1856d3c507edc8500c7563a8c3bd5d7165c52dc4bc8602e0e11e5" exitCode=0 Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.193820 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" event={"ID":"bffc6cea-5224-4f60-b2c2-3e93aa5cff97","Type":"ContainerDied","Data":"6288384aa2e1856d3c507edc8500c7563a8c3bd5d7165c52dc4bc8602e0e11e5"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.193847 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" event={"ID":"bffc6cea-5224-4f60-b2c2-3e93aa5cff97","Type":"ContainerStarted","Data":"b9167486f9c8c7a0050c9f9e894b45e1cd17a07e4b4dfac55dd3fd78d11861f9"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.196999 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-client-ca\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.197163 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/890abe57-aa9b-4c46-8a26-c2c1fd724fab-serving-cert\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.197278 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-config\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.198741 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" event={"ID":"c0382652-0457-4ef8-a28a-ac1a9c2de73d","Type":"ContainerStarted","Data":"5286e0b1b74e784f0744cd7123a24c443e6bac3f4941b05a0709676a9bdf18b7"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.198804 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" event={"ID":"c0382652-0457-4ef8-a28a-ac1a9c2de73d","Type":"ContainerStarted","Data":"0c87f5c5a7732e8f8d84efcf756e11b0a03e2f489221a81b5be4e81334e1633c"} Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.202437 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 02:44:38 
crc kubenswrapper[4901]: I0309 02:44:38.237922 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.241838 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.262072 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.281939 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.302857 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.321342 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.342798 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.362176 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.384318 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.403022 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 09 02:44:38 crc 
kubenswrapper[4901]: I0309 02:44:38.422198 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.480373 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx2wh\" (UniqueName: \"kubernetes.io/projected/64d1cf60-f9b7-4722-83aa-7967cd9827d6-kube-api-access-jx2wh\") pod \"machine-approver-56656f9798-pxqjl\" (UID: \"64d1cf60-f9b7-4722-83aa-7967cd9827d6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.503581 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgfr4\" (UniqueName: \"kubernetes.io/projected/0bef1913-8737-48c8-bcf5-89daf1bd1c54-kube-api-access-hgfr4\") pod \"oauth-openshift-558db77b4-tzf54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.503613 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.521706 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.542708 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.552615 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.565507 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.574888 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.608041 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7d59\" (UniqueName: \"kubernetes.io/projected/e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0-kube-api-access-k7d59\") pod \"openshift-config-operator-7777fb866f-nrgmz\" (UID: \"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.620090 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.620268 4901 request.go:700] Waited for 1.865146232s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.645027 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p85mn\" (UniqueName: \"kubernetes.io/projected/57ac07da-c623-40dc-b43b-87d5b43c4468-kube-api-access-p85mn\") pod \"authentication-operator-69f744f599-vwwl2\" (UID: \"57ac07da-c623-40dc-b43b-87d5b43c4468\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.652309 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8f28c79-5540-44c0-acea-aa36ce8a47d9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.659563 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkfnm\" (UniqueName: \"kubernetes.io/projected/054fc716-7800-43b0-af23-328b685f89f9-kube-api-access-wkfnm\") pod \"downloads-7954f5f757-jznjb\" (UID: \"054fc716-7800-43b0-af23-328b685f89f9\") " pod="openshift-console/downloads-7954f5f757-jznjb" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.687301 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6jbh\" (UniqueName: \"kubernetes.io/projected/99c51354-0ea1-4e7a-bd34-1bb57e6422c0-kube-api-access-x6jbh\") pod \"cluster-samples-operator-665b6dd947-jffft\" (UID: 
\"99c51354-0ea1-4e7a-bd34-1bb57e6422c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.698797 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sc9g\" (UniqueName: \"kubernetes.io/projected/c8f28c79-5540-44c0-acea-aa36ce8a47d9-kube-api-access-7sc9g\") pod \"ingress-operator-5b745b69d9-kxd66\" (UID: \"c8f28c79-5540-44c0-acea-aa36ce8a47d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.717115 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1336f974-5df8-4d81-91bb-2c2364c87479-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xtpw8\" (UID: \"1336f974-5df8-4d81-91bb-2c2364c87479\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.717393 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jznjb" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.733204 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.736604 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dj9g\" (UniqueName: \"kubernetes.io/projected/9800e019-e239-4af9-9059-0c29be7ca479-kube-api-access-7dj9g\") pod \"etcd-operator-b45778765-stq8d\" (UID: \"9800e019-e239-4af9-9059-0c29be7ca479\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.749509 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.787871 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.792795 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/890abe57-aa9b-4c46-8a26-c2c1fd724fab-serving-cert\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.807201 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.807816 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-registry-certificates\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.807870 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4816493b-f6f6-425b-88f2-67aa948f3c67-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.807892 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a7e3f000-a2fc-4ebb-a6ee-00fda57097b7-metrics-tls\") pod \"dns-operator-744455d44c-p7chk\" (UID: \"a7e3f000-a2fc-4ebb-a6ee-00fda57097b7\") " pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.807917 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1969ad35-8f20-4676-b7b6-375c5b93aa14-config-volume\") pod \"dns-default-np7wv\" (UID: \"1969ad35-8f20-4676-b7b6-375c5b93aa14\") " pod="openshift-dns/dns-default-np7wv" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.807935 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bwvv\" (UniqueName: \"kubernetes.io/projected/1969ad35-8f20-4676-b7b6-375c5b93aa14-kube-api-access-4bwvv\") pod \"dns-default-np7wv\" (UID: \"1969ad35-8f20-4676-b7b6-375c5b93aa14\") " pod="openshift-dns/dns-default-np7wv" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.810425 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tzf54"] Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.812129 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-config\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.812827 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/730c2ff6-fe2c-425c-88a8-686d2d7617f0-serving-cert\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " 
pod="openshift-console-operator/console-operator-58897d9998-zfqpl" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.812862 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cnjd\" (UniqueName: \"kubernetes.io/projected/730c2ff6-fe2c-425c-88a8-686d2d7617f0-kube-api-access-7cnjd\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.812897 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3befa408-f48c-4244-81ca-6bf178967fbe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.812915 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4816493b-f6f6-425b-88f2-67aa948f3c67-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.812948 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4816493b-f6f6-425b-88f2-67aa948f3c67-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813028 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w2l5\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-kube-api-access-7w2l5\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813070 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/730c2ff6-fe2c-425c-88a8-686d2d7617f0-config\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813240 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813264 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv8ld\" (UniqueName: \"kubernetes.io/projected/4816493b-f6f6-425b-88f2-67aa948f3c67-kube-api-access-fv8ld\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813284 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1969ad35-8f20-4676-b7b6-375c5b93aa14-metrics-tls\") pod 
\"dns-default-np7wv\" (UID: \"1969ad35-8f20-4676-b7b6-375c5b93aa14\") " pod="openshift-dns/dns-default-np7wv" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813347 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-bound-sa-token\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813372 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/730c2ff6-fe2c-425c-88a8-686d2d7617f0-trusted-ca\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813421 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-registry-tls\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813453 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzp6t\" (UniqueName: \"kubernetes.io/projected/a7e3f000-a2fc-4ebb-a6ee-00fda57097b7-kube-api-access-gzp6t\") pod \"dns-operator-744455d44c-p7chk\" (UID: \"a7e3f000-a2fc-4ebb-a6ee-00fda57097b7\") " pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813471 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3befa408-f48c-4244-81ca-6bf178967fbe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.813537 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-trusted-ca\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:38 crc kubenswrapper[4901]: E0309 02:44:38.813832 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:39.313818638 +0000 UTC m=+203.903482370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.822924 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.854147 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.861743 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.862409 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwtv4\" (UniqueName: \"kubernetes.io/projected/890abe57-aa9b-4c46-8a26-c2c1fd724fab-kube-api-access-xwtv4\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.867873 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-client-ca\") pod \"route-controller-manager-6576b87f9c-57xhf\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.883462 4901 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.914661 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:38 crc kubenswrapper[4901]: E0309 02:44:38.914862 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:39.414837996 +0000 UTC m=+204.004501728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.914995 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1969ad35-8f20-4676-b7b6-375c5b93aa14-metrics-tls\") pod \"dns-default-np7wv\" (UID: \"1969ad35-8f20-4676-b7b6-375c5b93aa14\") " pod="openshift-dns/dns-default-np7wv" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915022 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-stats-auth\") pod 
\"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915049 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqrxt\" (UniqueName: \"kubernetes.io/projected/65646690-9b87-47d8-a187-207924a2c486-kube-api-access-nqrxt\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915074 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7mnw\" (UniqueName: \"kubernetes.io/projected/9c86283c-c972-4467-b374-5d638dbfd9b9-kube-api-access-h7mnw\") pod \"service-ca-9c57cc56f-v478l\" (UID: \"9c86283c-c972-4467-b374-5d638dbfd9b9\") " pod="openshift-service-ca/service-ca-9c57cc56f-v478l" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915098 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq525\" (UniqueName: \"kubernetes.io/projected/8e691e4a-54da-49cd-acaf-e1b14cadde2e-kube-api-access-sq525\") pod \"machine-config-controller-84d6567774-ffb55\" (UID: \"8e691e4a-54da-49cd-acaf-e1b14cadde2e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915112 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwn9p\" (UniqueName: \"kubernetes.io/projected/c9a5b87c-5fb9-44a9-9f71-84aec278ac58-kube-api-access-lwn9p\") pod \"ingress-canary-p949f\" (UID: \"c9a5b87c-5fb9-44a9-9f71-84aec278ac58\") " pod="openshift-ingress-canary/ingress-canary-p949f" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915128 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5jjc7\" (UID: \"f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915145 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-tmpfs\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915161 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvfpj\" (UniqueName: \"kubernetes.io/projected/0860921b-92bb-498b-97b1-ee87b6e985cc-kube-api-access-xvfpj\") pod \"service-ca-operator-777779d784-m52jv\" (UID: \"0860921b-92bb-498b-97b1-ee87b6e985cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915177 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c80e8dad-f5d6-4a23-90f9-4de04483c9be-certs\") pod \"machine-config-server-x28d9\" (UID: \"c80e8dad-f5d6-4a23-90f9-4de04483c9be\") " pod="openshift-machine-config-operator/machine-config-server-x28d9" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915190 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e691e4a-54da-49cd-acaf-e1b14cadde2e-proxy-tls\") pod 
\"machine-config-controller-84d6567774-ffb55\" (UID: \"8e691e4a-54da-49cd-acaf-e1b14cadde2e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915205 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79vzb\" (UniqueName: \"kubernetes.io/projected/2009785b-23de-4c85-8dbc-285219ade858-kube-api-access-79vzb\") pod \"control-plane-machine-set-operator-78cbb6b69f-8m6x2\" (UID: \"2009785b-23de-4c85-8dbc-285219ade858\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915252 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e691e4a-54da-49cd-acaf-e1b14cadde2e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ffb55\" (UID: \"8e691e4a-54da-49cd-acaf-e1b14cadde2e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915271 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d39bde0-c78d-4e46-b388-78ce5fbadb9f-srv-cert\") pod \"catalog-operator-68c6474976-fl2d7\" (UID: \"7d39bde0-c78d-4e46-b388-78ce5fbadb9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915297 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzp6t\" (UniqueName: \"kubernetes.io/projected/a7e3f000-a2fc-4ebb-a6ee-00fda57097b7-kube-api-access-gzp6t\") pod \"dns-operator-744455d44c-p7chk\" (UID: \"a7e3f000-a2fc-4ebb-a6ee-00fda57097b7\") " pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" Mar 09 02:44:38 
crc kubenswrapper[4901]: I0309 02:44:38.915313 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0860921b-92bb-498b-97b1-ee87b6e985cc-config\") pod \"service-ca-operator-777779d784-m52jv\" (UID: \"0860921b-92bb-498b-97b1-ee87b6e985cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915355 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3befa408-f48c-4244-81ca-6bf178967fbe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915371 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9pdz\" (UniqueName: \"kubernetes.io/projected/f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93-kube-api-access-c9pdz\") pod \"package-server-manager-789f6589d5-5jjc7\" (UID: \"f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915386 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-trusted-ca\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915403 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqz2l\" (UniqueName: \"kubernetes.io/projected/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-kube-api-access-tqz2l\") pod 
\"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915419 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59743d02-8280-4571-b67f-fba4e4659d39-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nlx8v\" (UID: \"59743d02-8280-4571-b67f-fba4e4659d39\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915435 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12637446-5b6a-4b60-911a-335c765680b4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvk7f\" (UID: \"12637446-5b6a-4b60-911a-335c765680b4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915452 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4816493b-f6f6-425b-88f2-67aa948f3c67-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915468 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7e3f000-a2fc-4ebb-a6ee-00fda57097b7-metrics-tls\") pod \"dns-operator-744455d44c-p7chk\" (UID: \"a7e3f000-a2fc-4ebb-a6ee-00fda57097b7\") " pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 
02:44:38.915483 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/463898fa-b0fc-411e-abce-b0c64f32e240-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qnwn6\" (UID: \"463898fa-b0fc-411e-abce-b0c64f32e240\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915506 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1969ad35-8f20-4676-b7b6-375c5b93aa14-config-volume\") pod \"dns-default-np7wv\" (UID: \"1969ad35-8f20-4676-b7b6-375c5b93aa14\") " pod="openshift-dns/dns-default-np7wv" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915523 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a65957-3afe-4859-8ec3-4b3a2180e744-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zf94m\" (UID: \"c4a65957-3afe-4859-8ec3-4b3a2180e744\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915542 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/38e11ab1-ab41-4665-90ce-5c7aba1639f2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m299m\" (UID: \"38e11ab1-ab41-4665-90ce-5c7aba1639f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915555 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-webhook-cert\") pod 
\"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915580 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463898fa-b0fc-411e-abce-b0c64f32e240-config\") pod \"kube-controller-manager-operator-78b949d7b-qnwn6\" (UID: \"463898fa-b0fc-411e-abce-b0c64f32e240\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915596 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915642 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkn9d\" (UniqueName: \"kubernetes.io/projected/680e9e87-71a2-402c-84f2-e8eb2b7a4c44-kube-api-access-dkn9d\") pod \"auto-csr-approver-29550404-w8858\" (UID: \"680e9e87-71a2-402c-84f2-e8eb2b7a4c44\") " pod="openshift-infra/auto-csr-approver-29550404-w8858" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915670 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9c86283c-c972-4467-b374-5d638dbfd9b9-signing-key\") pod \"service-ca-9c57cc56f-v478l\" (UID: \"9c86283c-c972-4467-b374-5d638dbfd9b9\") " pod="openshift-service-ca/service-ca-9c57cc56f-v478l" Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915686 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-metrics-certs\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915700 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3336f36-4389-49db-a669-fe0cbc0bfdfd-config-volume\") pod \"collect-profiles-29550390-hrjpt\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915738 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3befa408-f48c-4244-81ca-6bf178967fbe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915754 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-default-certificate\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915787 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4816493b-f6f6-425b-88f2-67aa948f3c67-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915801 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/38e11ab1-ab41-4665-90ce-5c7aba1639f2-srv-cert\") pod \"olm-operator-6b444d44fb-m299m\" (UID: \"38e11ab1-ab41-4665-90ce-5c7aba1639f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915834 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9n5l\" (UniqueName: \"kubernetes.io/projected/59743d02-8280-4571-b67f-fba4e4659d39-kube-api-access-d9n5l\") pod \"kube-storage-version-migrator-operator-b67b599dd-nlx8v\" (UID: \"59743d02-8280-4571-b67f-fba4e4659d39\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915858 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4816493b-f6f6-425b-88f2-67aa948f3c67-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915874 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf9xq\" (UniqueName: \"kubernetes.io/projected/469c34ce-1d46-4d99-a1c4-ac180fd08322-kube-api-access-mf9xq\") pod \"migrator-59844c95c7-cwmk8\" (UID: \"469c34ce-1d46-4d99-a1c4-ac180fd08322\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915908 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3336f36-4389-49db-a669-fe0cbc0bfdfd-secret-volume\") pod \"collect-profiles-29550390-hrjpt\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915926 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-mountpoint-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915940 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kwffm\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kwffm"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915956 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv8ld\" (UniqueName: \"kubernetes.io/projected/4816493b-f6f6-425b-88f2-67aa948f3c67-kube-api-access-fv8ld\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915972 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4a65957-3afe-4859-8ec3-4b3a2180e744-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zf94m\" (UID: \"c4a65957-3afe-4859-8ec3-4b3a2180e744\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.915988 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12637446-5b6a-4b60-911a-335c765680b4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvk7f\" (UID: \"12637446-5b6a-4b60-911a-335c765680b4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916011 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bk27\" (UniqueName: \"kubernetes.io/projected/c80e8dad-f5d6-4a23-90f9-4de04483c9be-kube-api-access-6bk27\") pod \"machine-config-server-x28d9\" (UID: \"c80e8dad-f5d6-4a23-90f9-4de04483c9be\") " pod="openshift-machine-config-operator/machine-config-server-x28d9"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916036 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59743d02-8280-4571-b67f-fba4e4659d39-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nlx8v\" (UID: \"59743d02-8280-4571-b67f-fba4e4659d39\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916051 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-oauth-serving-cert\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916076 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-bound-sa-token\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916093 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d39bde0-c78d-4e46-b388-78ce5fbadb9f-profile-collector-cert\") pod \"catalog-operator-68c6474976-fl2d7\" (UID: \"7d39bde0-c78d-4e46-b388-78ce5fbadb9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916114 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpr9w\" (UniqueName: \"kubernetes.io/projected/38e11ab1-ab41-4665-90ce-5c7aba1639f2-kube-api-access-lpr9w\") pod \"olm-operator-6b444d44fb-m299m\" (UID: \"38e11ab1-ab41-4665-90ce-5c7aba1639f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916132 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-proxy-tls\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916150 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/730c2ff6-fe2c-425c-88a8-686d2d7617f0-trusted-ca\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916167 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-service-ca-bundle\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916182 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-trusted-ca-bundle\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916199 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/18a59baf-2a09-41ff-94c9-1219cf47dfc2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m8h87\" (UID: \"18a59baf-2a09-41ff-94c9-1219cf47dfc2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916216 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-apiservice-cert\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916254 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-registry-tls\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916270 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlgdf\" (UniqueName: \"kubernetes.io/projected/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-kube-api-access-mlgdf\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916285 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kwffm\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kwffm"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-oauth-config\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916336 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-plugins-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916352 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-registry-certificates\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916399 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-console-config\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916414 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-images\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916454 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bwvv\" (UniqueName: \"kubernetes.io/projected/1969ad35-8f20-4676-b7b6-375c5b93aa14-kube-api-access-4bwvv\") pod \"dns-default-np7wv\" (UID: \"1969ad35-8f20-4676-b7b6-375c5b93aa14\") " pod="openshift-dns/dns-default-np7wv"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916477 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwnxp\" (UniqueName: \"kubernetes.io/projected/ae631b64-6f22-4112-8fb8-aa2c5140275b-kube-api-access-zwnxp\") pod \"marketplace-operator-79b997595-kwffm\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kwffm"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916492 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbzwb\" (UniqueName: \"kubernetes.io/projected/e3336f36-4389-49db-a669-fe0cbc0bfdfd-kube-api-access-qbzwb\") pod \"collect-profiles-29550390-hrjpt\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916516 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ck6m\" (UniqueName: \"kubernetes.io/projected/18a59baf-2a09-41ff-94c9-1219cf47dfc2-kube-api-access-4ck6m\") pod \"multus-admission-controller-857f4d67dd-m8h87\" (UID: \"18a59baf-2a09-41ff-94c9-1219cf47dfc2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916531 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-serving-cert\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916546 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cnjd\" (UniqueName: \"kubernetes.io/projected/730c2ff6-fe2c-425c-88a8-686d2d7617f0-kube-api-access-7cnjd\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916562 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-socket-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916588 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c80e8dad-f5d6-4a23-90f9-4de04483c9be-node-bootstrap-token\") pod \"machine-config-server-x28d9\" (UID: \"c80e8dad-f5d6-4a23-90f9-4de04483c9be\") " pod="openshift-machine-config-operator/machine-config-server-x28d9"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916609 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/730c2ff6-fe2c-425c-88a8-686d2d7617f0-serving-cert\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916634 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/463898fa-b0fc-411e-abce-b0c64f32e240-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qnwn6\" (UID: \"463898fa-b0fc-411e-abce-b0c64f32e240\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916649 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm2vx\" (UniqueName: \"kubernetes.io/projected/12637446-5b6a-4b60-911a-335c765680b4-kube-api-access-hm2vx\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvk7f\" (UID: \"12637446-5b6a-4b60-911a-335c765680b4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916696 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-csi-data-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916713 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2009785b-23de-4c85-8dbc-285219ade858-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8m6x2\" (UID: \"2009785b-23de-4c85-8dbc-285219ade858\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916749 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a65957-3afe-4859-8ec3-4b3a2180e744-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zf94m\" (UID: \"c4a65957-3afe-4859-8ec3-4b3a2180e744\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916774 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwh8x\" (UniqueName: \"kubernetes.io/projected/7d39bde0-c78d-4e46-b388-78ce5fbadb9f-kube-api-access-fwh8x\") pod \"catalog-operator-68c6474976-fl2d7\" (UID: \"7d39bde0-c78d-4e46-b388-78ce5fbadb9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916788 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a5b87c-5fb9-44a9-9f71-84aec278ac58-cert\") pod \"ingress-canary-p949f\" (UID: \"c9a5b87c-5fb9-44a9-9f71-84aec278ac58\") " pod="openshift-ingress-canary/ingress-canary-p949f"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916804 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-service-ca\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916822 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w2l5\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-kube-api-access-7w2l5\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916840 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bchg7\" (UniqueName: \"kubernetes.io/projected/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-kube-api-access-bchg7\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916859 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/730c2ff6-fe2c-425c-88a8-686d2d7617f0-config\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916891 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916907 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrfvl\" (UniqueName: \"kubernetes.io/projected/9e9eea51-12bc-40f5-94b0-3fb75a48b898-kube-api-access-nrfvl\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.916922 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-registration-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.917002 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9c86283c-c972-4467-b374-5d638dbfd9b9-signing-cabundle\") pod \"service-ca-9c57cc56f-v478l\" (UID: \"9c86283c-c972-4467-b374-5d638dbfd9b9\") " pod="openshift-service-ca/service-ca-9c57cc56f-v478l"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.918527 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3befa408-f48c-4244-81ca-6bf178967fbe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.919295 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0860921b-92bb-498b-97b1-ee87b6e985cc-serving-cert\") pod \"service-ca-operator-777779d784-m52jv\" (UID: \"0860921b-92bb-498b-97b1-ee87b6e985cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.919746 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/730c2ff6-fe2c-425c-88a8-686d2d7617f0-trusted-ca\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.920512 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-registry-certificates\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.922622 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-trusted-ca\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.923105 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1969ad35-8f20-4676-b7b6-375c5b93aa14-config-volume\") pod \"dns-default-np7wv\" (UID: \"1969ad35-8f20-4676-b7b6-375c5b93aa14\") " pod="openshift-dns/dns-default-np7wv"
Mar 09 02:44:38 crc kubenswrapper[4901]: E0309 02:44:38.923979 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:39.423966435 +0000 UTC m=+204.013630167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.924372 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/730c2ff6-fe2c-425c-88a8-686d2d7617f0-config\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.932516 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4816493b-f6f6-425b-88f2-67aa948f3c67-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.933726 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7e3f000-a2fc-4ebb-a6ee-00fda57097b7-metrics-tls\") pod \"dns-operator-744455d44c-p7chk\" (UID: \"a7e3f000-a2fc-4ebb-a6ee-00fda57097b7\") " pod="openshift-dns-operator/dns-operator-744455d44c-p7chk"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.940052 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4816493b-f6f6-425b-88f2-67aa948f3c67-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.944265 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3befa408-f48c-4244-81ca-6bf178967fbe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.947985 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/730c2ff6-fe2c-425c-88a8-686d2d7617f0-serving-cert\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.948554 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.952623 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-registry-tls\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.958020 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1969ad35-8f20-4676-b7b6-375c5b93aa14-metrics-tls\") pod \"dns-default-np7wv\" (UID: \"1969ad35-8f20-4676-b7b6-375c5b93aa14\") " pod="openshift-dns/dns-default-np7wv"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.968820 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv8ld\" (UniqueName: \"kubernetes.io/projected/4816493b-f6f6-425b-88f2-67aa948f3c67-kube-api-access-fv8ld\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.979560 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cnjd\" (UniqueName: \"kubernetes.io/projected/730c2ff6-fe2c-425c-88a8-686d2d7617f0-kube-api-access-7cnjd\") pod \"console-operator-58897d9998-zfqpl\" (UID: \"730c2ff6-fe2c-425c-88a8-686d2d7617f0\") " pod="openshift-console-operator/console-operator-58897d9998-zfqpl"
Mar 09 02:44:38 crc kubenswrapper[4901]: I0309 02:44:38.998463 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-bound-sa-token\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.020732 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.020904 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-stats-auth\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.020931 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqrxt\" (UniqueName: \"kubernetes.io/projected/65646690-9b87-47d8-a187-207924a2c486-kube-api-access-nqrxt\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.020957 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7mnw\" (UniqueName: \"kubernetes.io/projected/9c86283c-c972-4467-b374-5d638dbfd9b9-kube-api-access-h7mnw\") pod \"service-ca-9c57cc56f-v478l\" (UID: \"9c86283c-c972-4467-b374-5d638dbfd9b9\") " pod="openshift-service-ca/service-ca-9c57cc56f-v478l"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.020979 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq525\" (UniqueName: \"kubernetes.io/projected/8e691e4a-54da-49cd-acaf-e1b14cadde2e-kube-api-access-sq525\") pod \"machine-config-controller-84d6567774-ffb55\" (UID: \"8e691e4a-54da-49cd-acaf-e1b14cadde2e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.020998 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwn9p\" (UniqueName: \"kubernetes.io/projected/c9a5b87c-5fb9-44a9-9f71-84aec278ac58-kube-api-access-lwn9p\") pod \"ingress-canary-p949f\" (UID: \"c9a5b87c-5fb9-44a9-9f71-84aec278ac58\") " pod="openshift-ingress-canary/ingress-canary-p949f"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021015 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5jjc7\" (UID: \"f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021031 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-tmpfs\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021048 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfpj\" (UniqueName: \"kubernetes.io/projected/0860921b-92bb-498b-97b1-ee87b6e985cc-kube-api-access-xvfpj\") pod \"service-ca-operator-777779d784-m52jv\" (UID: \"0860921b-92bb-498b-97b1-ee87b6e985cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021067 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c80e8dad-f5d6-4a23-90f9-4de04483c9be-certs\") pod \"machine-config-server-x28d9\" (UID: \"c80e8dad-f5d6-4a23-90f9-4de04483c9be\") " pod="openshift-machine-config-operator/machine-config-server-x28d9"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021091 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e691e4a-54da-49cd-acaf-e1b14cadde2e-proxy-tls\") pod \"machine-config-controller-84d6567774-ffb55\" (UID: \"8e691e4a-54da-49cd-acaf-e1b14cadde2e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021109 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79vzb\" (UniqueName: \"kubernetes.io/projected/2009785b-23de-4c85-8dbc-285219ade858-kube-api-access-79vzb\") pod \"control-plane-machine-set-operator-78cbb6b69f-8m6x2\" (UID: \"2009785b-23de-4c85-8dbc-285219ade858\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021126 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e691e4a-54da-49cd-acaf-e1b14cadde2e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ffb55\" (UID: \"8e691e4a-54da-49cd-acaf-e1b14cadde2e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021142 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d39bde0-c78d-4e46-b388-78ce5fbadb9f-srv-cert\") pod \"catalog-operator-68c6474976-fl2d7\" (UID: \"7d39bde0-c78d-4e46-b388-78ce5fbadb9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021164 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0860921b-92bb-498b-97b1-ee87b6e985cc-config\") pod \"service-ca-operator-777779d784-m52jv\" (UID: \"0860921b-92bb-498b-97b1-ee87b6e985cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021186 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9pdz\" (UniqueName: \"kubernetes.io/projected/f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93-kube-api-access-c9pdz\") pod \"package-server-manager-789f6589d5-5jjc7\" (UID: \"f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021203 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqz2l\" (UniqueName: \"kubernetes.io/projected/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-kube-api-access-tqz2l\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021235 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59743d02-8280-4571-b67f-fba4e4659d39-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nlx8v\" (UID: \"59743d02-8280-4571-b67f-fba4e4659d39\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v"
Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021252 4901
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12637446-5b6a-4b60-911a-335c765680b4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvk7f\" (UID: \"12637446-5b6a-4b60-911a-335c765680b4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021268 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/463898fa-b0fc-411e-abce-b0c64f32e240-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qnwn6\" (UID: \"463898fa-b0fc-411e-abce-b0c64f32e240\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021287 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a65957-3afe-4859-8ec3-4b3a2180e744-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zf94m\" (UID: \"c4a65957-3afe-4859-8ec3-4b3a2180e744\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021305 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/38e11ab1-ab41-4665-90ce-5c7aba1639f2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m299m\" (UID: \"38e11ab1-ab41-4665-90ce-5c7aba1639f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021321 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463898fa-b0fc-411e-abce-b0c64f32e240-config\") pod 
\"kube-controller-manager-operator-78b949d7b-qnwn6\" (UID: \"463898fa-b0fc-411e-abce-b0c64f32e240\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021342 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-webhook-cert\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021359 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021377 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkn9d\" (UniqueName: \"kubernetes.io/projected/680e9e87-71a2-402c-84f2-e8eb2b7a4c44-kube-api-access-dkn9d\") pod \"auto-csr-approver-29550404-w8858\" (UID: \"680e9e87-71a2-402c-84f2-e8eb2b7a4c44\") " pod="openshift-infra/auto-csr-approver-29550404-w8858" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021393 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9c86283c-c972-4467-b374-5d638dbfd9b9-signing-key\") pod \"service-ca-9c57cc56f-v478l\" (UID: \"9c86283c-c972-4467-b374-5d638dbfd9b9\") " pod="openshift-service-ca/service-ca-9c57cc56f-v478l" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021408 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-metrics-certs\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021425 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3336f36-4389-49db-a669-fe0cbc0bfdfd-config-volume\") pod \"collect-profiles-29550390-hrjpt\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021444 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-default-certificate\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021460 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9n5l\" (UniqueName: \"kubernetes.io/projected/59743d02-8280-4571-b67f-fba4e4659d39-kube-api-access-d9n5l\") pod \"kube-storage-version-migrator-operator-b67b599dd-nlx8v\" (UID: \"59743d02-8280-4571-b67f-fba4e4659d39\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021475 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/38e11ab1-ab41-4665-90ce-5c7aba1639f2-srv-cert\") pod \"olm-operator-6b444d44fb-m299m\" (UID: \"38e11ab1-ab41-4665-90ce-5c7aba1639f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 
02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021496 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf9xq\" (UniqueName: \"kubernetes.io/projected/469c34ce-1d46-4d99-a1c4-ac180fd08322-kube-api-access-mf9xq\") pod \"migrator-59844c95c7-cwmk8\" (UID: \"469c34ce-1d46-4d99-a1c4-ac180fd08322\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021510 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3336f36-4389-49db-a669-fe0cbc0bfdfd-secret-volume\") pod \"collect-profiles-29550390-hrjpt\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021534 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-mountpoint-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021551 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kwffm\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021566 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4a65957-3afe-4859-8ec3-4b3a2180e744-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zf94m\" (UID: 
\"c4a65957-3afe-4859-8ec3-4b3a2180e744\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021584 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12637446-5b6a-4b60-911a-335c765680b4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvk7f\" (UID: \"12637446-5b6a-4b60-911a-335c765680b4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021599 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bk27\" (UniqueName: \"kubernetes.io/projected/c80e8dad-f5d6-4a23-90f9-4de04483c9be-kube-api-access-6bk27\") pod \"machine-config-server-x28d9\" (UID: \"c80e8dad-f5d6-4a23-90f9-4de04483c9be\") " pod="openshift-machine-config-operator/machine-config-server-x28d9" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021613 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59743d02-8280-4571-b67f-fba4e4659d39-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nlx8v\" (UID: \"59743d02-8280-4571-b67f-fba4e4659d39\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021628 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-oauth-serving-cert\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021650 4901 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d39bde0-c78d-4e46-b388-78ce5fbadb9f-profile-collector-cert\") pod \"catalog-operator-68c6474976-fl2d7\" (UID: \"7d39bde0-c78d-4e46-b388-78ce5fbadb9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021664 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpr9w\" (UniqueName: \"kubernetes.io/projected/38e11ab1-ab41-4665-90ce-5c7aba1639f2-kube-api-access-lpr9w\") pod \"olm-operator-6b444d44fb-m299m\" (UID: \"38e11ab1-ab41-4665-90ce-5c7aba1639f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021680 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-proxy-tls\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021695 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-service-ca-bundle\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021712 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-trusted-ca-bundle\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc 
kubenswrapper[4901]: I0309 02:44:39.021729 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/18a59baf-2a09-41ff-94c9-1219cf47dfc2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m8h87\" (UID: \"18a59baf-2a09-41ff-94c9-1219cf47dfc2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021745 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-apiservice-cert\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021763 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlgdf\" (UniqueName: \"kubernetes.io/projected/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-kube-api-access-mlgdf\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021779 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kwffm\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021794 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-oauth-config\") pod 
\"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021810 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-plugins-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021827 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-console-config\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021843 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-images\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021869 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwnxp\" (UniqueName: \"kubernetes.io/projected/ae631b64-6f22-4112-8fb8-aa2c5140275b-kube-api-access-zwnxp\") pod \"marketplace-operator-79b997595-kwffm\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021884 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbzwb\" (UniqueName: 
\"kubernetes.io/projected/e3336f36-4389-49db-a669-fe0cbc0bfdfd-kube-api-access-qbzwb\") pod \"collect-profiles-29550390-hrjpt\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021907 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ck6m\" (UniqueName: \"kubernetes.io/projected/18a59baf-2a09-41ff-94c9-1219cf47dfc2-kube-api-access-4ck6m\") pod \"multus-admission-controller-857f4d67dd-m8h87\" (UID: \"18a59baf-2a09-41ff-94c9-1219cf47dfc2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021921 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-serving-cert\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021938 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-socket-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021954 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c80e8dad-f5d6-4a23-90f9-4de04483c9be-node-bootstrap-token\") pod \"machine-config-server-x28d9\" (UID: \"c80e8dad-f5d6-4a23-90f9-4de04483c9be\") " pod="openshift-machine-config-operator/machine-config-server-x28d9" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021970 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/463898fa-b0fc-411e-abce-b0c64f32e240-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qnwn6\" (UID: \"463898fa-b0fc-411e-abce-b0c64f32e240\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.021986 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm2vx\" (UniqueName: \"kubernetes.io/projected/12637446-5b6a-4b60-911a-335c765680b4-kube-api-access-hm2vx\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvk7f\" (UID: \"12637446-5b6a-4b60-911a-335c765680b4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022002 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-csi-data-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022018 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2009785b-23de-4c85-8dbc-285219ade858-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8m6x2\" (UID: \"2009785b-23de-4c85-8dbc-285219ade858\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022034 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a65957-3afe-4859-8ec3-4b3a2180e744-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-zf94m\" (UID: \"c4a65957-3afe-4859-8ec3-4b3a2180e744\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022051 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwh8x\" (UniqueName: \"kubernetes.io/projected/7d39bde0-c78d-4e46-b388-78ce5fbadb9f-kube-api-access-fwh8x\") pod \"catalog-operator-68c6474976-fl2d7\" (UID: \"7d39bde0-c78d-4e46-b388-78ce5fbadb9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022067 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a5b87c-5fb9-44a9-9f71-84aec278ac58-cert\") pod \"ingress-canary-p949f\" (UID: \"c9a5b87c-5fb9-44a9-9f71-84aec278ac58\") " pod="openshift-ingress-canary/ingress-canary-p949f" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022088 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-service-ca\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022105 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bchg7\" (UniqueName: \"kubernetes.io/projected/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-kube-api-access-bchg7\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022130 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrfvl\" (UniqueName: 
\"kubernetes.io/projected/9e9eea51-12bc-40f5-94b0-3fb75a48b898-kube-api-access-nrfvl\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022146 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9c86283c-c972-4467-b374-5d638dbfd9b9-signing-cabundle\") pod \"service-ca-9c57cc56f-v478l\" (UID: \"9c86283c-c972-4467-b374-5d638dbfd9b9\") " pod="openshift-service-ca/service-ca-9c57cc56f-v478l" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022160 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0860921b-92bb-498b-97b1-ee87b6e985cc-serving-cert\") pod \"service-ca-operator-777779d784-m52jv\" (UID: \"0860921b-92bb-498b-97b1-ee87b6e985cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022174 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-registration-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022367 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-registration-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.022877 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0860921b-92bb-498b-97b1-ee87b6e985cc-config\") pod \"service-ca-operator-777779d784-m52jv\" (UID: \"0860921b-92bb-498b-97b1-ee87b6e985cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.023811 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzp6t\" (UniqueName: \"kubernetes.io/projected/a7e3f000-a2fc-4ebb-a6ee-00fda57097b7-kube-api-access-gzp6t\") pod \"dns-operator-744455d44c-p7chk\" (UID: \"a7e3f000-a2fc-4ebb-a6ee-00fda57097b7\") " pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.024461 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-service-ca-bundle\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.025923 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59743d02-8280-4571-b67f-fba4e4659d39-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nlx8v\" (UID: \"59743d02-8280-4571-b67f-fba4e4659d39\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.026447 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.026745 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-trusted-ca-bundle\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.026881 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-socket-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.027537 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12637446-5b6a-4b60-911a-335c765680b4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvk7f\" (UID: \"12637446-5b6a-4b60-911a-335c765680b4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 02:44:39.029472 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:39.529452575 +0000 UTC m=+204.119116307 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.035139 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-stats-auth\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.035433 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/463898fa-b0fc-411e-abce-b0c64f32e240-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qnwn6\" (UID: \"463898fa-b0fc-411e-abce-b0c64f32e240\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.038557 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c80e8dad-f5d6-4a23-90f9-4de04483c9be-node-bootstrap-token\") pod \"machine-config-server-x28d9\" (UID: \"c80e8dad-f5d6-4a23-90f9-4de04483c9be\") " pod="openshift-machine-config-operator/machine-config-server-x28d9" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.038773 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-csi-data-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: 
\"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.040327 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5jjc7\" (UID: \"f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.040657 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-tmpfs\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.042040 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2009785b-23de-4c85-8dbc-285219ade858-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8m6x2\" (UID: \"2009785b-23de-4c85-8dbc-285219ade858\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.042518 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a65957-3afe-4859-8ec3-4b3a2180e744-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zf94m\" (UID: \"c4a65957-3afe-4859-8ec3-4b3a2180e744\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.043500 4901 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/38e11ab1-ab41-4665-90ce-5c7aba1639f2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m299m\" (UID: \"38e11ab1-ab41-4665-90ce-5c7aba1639f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.043835 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d39bde0-c78d-4e46-b388-78ce5fbadb9f-srv-cert\") pod \"catalog-operator-68c6474976-fl2d7\" (UID: \"7d39bde0-c78d-4e46-b388-78ce5fbadb9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.043919 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463898fa-b0fc-411e-abce-b0c64f32e240-config\") pod \"kube-controller-manager-operator-78b949d7b-qnwn6\" (UID: \"463898fa-b0fc-411e-abce-b0c64f32e240\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.044590 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-console-config\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.044815 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a5b87c-5fb9-44a9-9f71-84aec278ac58-cert\") pod \"ingress-canary-p949f\" (UID: \"c9a5b87c-5fb9-44a9-9f71-84aec278ac58\") " pod="openshift-ingress-canary/ingress-canary-p949f" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.044842 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-jznjb"] Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.045357 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-service-ca\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.046011 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9c86283c-c972-4467-b374-5d638dbfd9b9-signing-cabundle\") pod \"service-ca-9c57cc56f-v478l\" (UID: \"9c86283c-c972-4467-b374-5d638dbfd9b9\") " pod="openshift-service-ca/service-ca-9c57cc56f-v478l" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.046589 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-webhook-cert\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.047611 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.047681 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e691e4a-54da-49cd-acaf-e1b14cadde2e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ffb55\" (UID: 
\"8e691e4a-54da-49cd-acaf-e1b14cadde2e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.048177 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-images\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.048561 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3336f36-4389-49db-a669-fe0cbc0bfdfd-config-volume\") pod \"collect-profiles-29550390-hrjpt\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.048861 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a65957-3afe-4859-8ec3-4b3a2180e744-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zf94m\" (UID: \"c4a65957-3afe-4859-8ec3-4b3a2180e744\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.048906 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-mountpoint-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.050423 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-default-certificate\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.051287 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0860921b-92bb-498b-97b1-ee87b6e985cc-serving-cert\") pod \"service-ca-operator-777779d784-m52jv\" (UID: \"0860921b-92bb-498b-97b1-ee87b6e985cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.052151 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12637446-5b6a-4b60-911a-335c765680b4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvk7f\" (UID: \"12637446-5b6a-4b60-911a-335c765680b4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.052482 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kwffm\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.052579 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9e9eea51-12bc-40f5-94b0-3fb75a48b898-plugins-dir\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.052926 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c80e8dad-f5d6-4a23-90f9-4de04483c9be-certs\") pod \"machine-config-server-x28d9\" (UID: \"c80e8dad-f5d6-4a23-90f9-4de04483c9be\") " pod="openshift-machine-config-operator/machine-config-server-x28d9" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.053062 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59743d02-8280-4571-b67f-fba4e4659d39-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nlx8v\" (UID: \"59743d02-8280-4571-b67f-fba4e4659d39\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.053065 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-oauth-serving-cert\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.054653 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/18a59baf-2a09-41ff-94c9-1219cf47dfc2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m8h87\" (UID: \"18a59baf-2a09-41ff-94c9-1219cf47dfc2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.054941 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e691e4a-54da-49cd-acaf-e1b14cadde2e-proxy-tls\") pod \"machine-config-controller-84d6567774-ffb55\" (UID: \"8e691e4a-54da-49cd-acaf-e1b14cadde2e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" Mar 09 02:44:39 crc 
kubenswrapper[4901]: I0309 02:44:39.056756 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-serving-cert\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.057493 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9c86283c-c972-4467-b374-5d638dbfd9b9-signing-key\") pod \"service-ca-9c57cc56f-v478l\" (UID: \"9c86283c-c972-4467-b374-5d638dbfd9b9\") " pod="openshift-service-ca/service-ca-9c57cc56f-v478l" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.058383 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-oauth-config\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.059192 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-apiservice-cert\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.059668 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d39bde0-c78d-4e46-b388-78ce5fbadb9f-profile-collector-cert\") pod \"catalog-operator-68c6474976-fl2d7\" (UID: \"7d39bde0-c78d-4e46-b388-78ce5fbadb9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" Mar 09 02:44:39 crc 
kubenswrapper[4901]: I0309 02:44:39.061656 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.062095 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bwvv\" (UniqueName: \"kubernetes.io/projected/1969ad35-8f20-4676-b7b6-375c5b93aa14-kube-api-access-4bwvv\") pod \"dns-default-np7wv\" (UID: \"1969ad35-8f20-4676-b7b6-375c5b93aa14\") " pod="openshift-dns/dns-default-np7wv" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.063124 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-proxy-tls\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.068469 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-metrics-certs\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.068682 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3336f36-4389-49db-a669-fe0cbc0bfdfd-secret-volume\") pod \"collect-profiles-29550390-hrjpt\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.076664 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kwffm\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.077678 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/38e11ab1-ab41-4665-90ce-5c7aba1639f2-srv-cert\") pod \"olm-operator-6b444d44fb-m299m\" (UID: \"38e11ab1-ab41-4665-90ce-5c7aba1639f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.077760 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w2l5\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-kube-api-access-7w2l5\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.088730 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4816493b-f6f6-425b-88f2-67aa948f3c67-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-sclvh\" (UID: \"4816493b-f6f6-425b-88f2-67aa948f3c67\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.127534 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 
02:44:39.128014 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:39.628001705 +0000 UTC m=+204.217665437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.153337 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqz2l\" (UniqueName: \"kubernetes.io/projected/5eb34c1b-916a-4f45-a6b3-e5e8b17a0872-kube-api-access-tqz2l\") pod \"router-default-5444994796-9v4g4\" (UID: \"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872\") " pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.165324 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9pdz\" (UniqueName: \"kubernetes.io/projected/f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93-kube-api-access-c9pdz\") pod \"package-server-manager-789f6589d5-5jjc7\" (UID: \"f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.174708 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz"] Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.178982 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqrxt\" (UniqueName: 
\"kubernetes.io/projected/65646690-9b87-47d8-a187-207924a2c486-kube-api-access-nqrxt\") pod \"console-f9d7485db-xc8gr\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") " pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.197285 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zfqpl" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.208609 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7mnw\" (UniqueName: \"kubernetes.io/projected/9c86283c-c972-4467-b374-5d638dbfd9b9-kube-api-access-h7mnw\") pod \"service-ca-9c57cc56f-v478l\" (UID: \"9c86283c-c972-4467-b374-5d638dbfd9b9\") " pod="openshift-service-ca/service-ca-9c57cc56f-v478l" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.217655 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-v478l" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.218276 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq525\" (UniqueName: \"kubernetes.io/projected/8e691e4a-54da-49cd-acaf-e1b14cadde2e-kube-api-access-sq525\") pod \"machine-config-controller-84d6567774-ffb55\" (UID: \"8e691e4a-54da-49cd-acaf-e1b14cadde2e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.225189 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" event={"ID":"64d1cf60-f9b7-4722-83aa-7967cd9827d6","Type":"ContainerStarted","Data":"ed3727be6b8ec310dddc715f41c68b3e18345078a1c947983613cf8626bf4d01"} Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.225249 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" event={"ID":"64d1cf60-f9b7-4722-83aa-7967cd9827d6","Type":"ContainerStarted","Data":"4293b82ebba76d592c8c62da8ce990598fd1c789e48da4a42fbc5d1912b7f6f1"} Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.226651 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" event={"ID":"e87e0a88-a606-4f22-be47-72cc718fce1b","Type":"ContainerStarted","Data":"b217a46cba54d6171eca86bd772872a3981b3f4fb0e3a3d022cd42281b734602"} Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.230603 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 02:44:39.231031 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:39.731017727 +0000 UTC m=+204.320681459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.236112 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwn9p\" (UniqueName: \"kubernetes.io/projected/c9a5b87c-5fb9-44a9-9f71-84aec278ac58-kube-api-access-lwn9p\") pod \"ingress-canary-p949f\" (UID: \"c9a5b87c-5fb9-44a9-9f71-84aec278ac58\") " pod="openshift-ingress-canary/ingress-canary-p949f" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.236870 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" event={"ID":"bffc6cea-5224-4f60-b2c2-3e93aa5cff97","Type":"ContainerStarted","Data":"c7f18b04b7e0c93faf23bd0142ea6235a9b161319ee6515edf926ea2d2d26793"} Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.236903 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" event={"ID":"bffc6cea-5224-4f60-b2c2-3e93aa5cff97","Type":"ContainerStarted","Data":"cb49615d282ac961b22fd3d6edd46999d65d3613886d6016eadb4d610df13881"} Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.243521 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jznjb" event={"ID":"054fc716-7800-43b0-af23-328b685f89f9","Type":"ContainerStarted","Data":"70bc3014dac74458e16191a61a818b5b4534e9ca4a4f7e3a7ac85ad00811c315"} Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.245698 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" 
event={"ID":"0bef1913-8737-48c8-bcf5-89daf1bd1c54","Type":"ContainerStarted","Data":"0ee6723aa85968887f12a733a8284e9d66a89e772c7329c665dd351a8b35a40e"} Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.254536 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.264514 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/463898fa-b0fc-411e-abce-b0c64f32e240-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qnwn6\" (UID: \"463898fa-b0fc-411e-abce-b0c64f32e240\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.275584 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.277881 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm2vx\" (UniqueName: \"kubernetes.io/projected/12637446-5b6a-4b60-911a-335c765680b4-kube-api-access-hm2vx\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvk7f\" (UID: \"12637446-5b6a-4b60-911a-335c765680b4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.286355 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvfpj\" (UniqueName: \"kubernetes.io/projected/0860921b-92bb-498b-97b1-ee87b6e985cc-kube-api-access-xvfpj\") pod \"service-ca-operator-777779d784-m52jv\" (UID: \"0860921b-92bb-498b-97b1-ee87b6e985cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.294790 4901 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p949f" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.304698 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.324850 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.346338 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwh8x\" (UniqueName: \"kubernetes.io/projected/7d39bde0-c78d-4e46-b388-78ce5fbadb9f-kube-api-access-fwh8x\") pod \"catalog-operator-68c6474976-fl2d7\" (UID: \"7d39bde0-c78d-4e46-b388-78ce5fbadb9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.348172 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.353492 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-np7wv" Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 02:44:39.363060 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:39.863042597 +0000 UTC m=+204.452706329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.365614 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.369361 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bchg7\" (UniqueName: \"kubernetes.io/projected/d65d3de3-6f6f-42e4-9853-bc5b7bad5236-kube-api-access-bchg7\") pod \"packageserver-d55dfcdfc-9fxwq\" (UID: \"d65d3de3-6f6f-42e4-9853-bc5b7bad5236\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.396028 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.400795 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8"] Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.403093 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.416588 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79vzb\" (UniqueName: \"kubernetes.io/projected/2009785b-23de-4c85-8dbc-285219ade858-kube-api-access-79vzb\") pod \"control-plane-machine-set-operator-78cbb6b69f-8m6x2\" (UID: \"2009785b-23de-4c85-8dbc-285219ade858\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.453716 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.454158 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.454186 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrfvl\" (UniqueName: \"kubernetes.io/projected/9e9eea51-12bc-40f5-94b0-3fb75a48b898-kube-api-access-nrfvl\") pod \"csi-hostpathplugin-lgxp2\" (UID: \"9e9eea51-12bc-40f5-94b0-3fb75a48b898\") " pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.454407 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.454489 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkn9d\" (UniqueName: \"kubernetes.io/projected/680e9e87-71a2-402c-84f2-e8eb2b7a4c44-kube-api-access-dkn9d\") pod \"auto-csr-approver-29550404-w8858\" (UID: \"680e9e87-71a2-402c-84f2-e8eb2b7a4c44\") " pod="openshift-infra/auto-csr-approver-29550404-w8858" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.455409 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 02:44:39.456900 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:39.956883168 +0000 UTC m=+204.546546900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.457023 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4a65957-3afe-4859-8ec3-4b3a2180e744-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zf94m\" (UID: \"c4a65957-3afe-4859-8ec3-4b3a2180e744\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.466132 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbzwb\" (UniqueName: \"kubernetes.io/projected/e3336f36-4389-49db-a669-fe0cbc0bfdfd-kube-api-access-qbzwb\") pod \"collect-profiles-29550390-hrjpt\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.474948 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stq8d"] Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.475915 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66"] Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.484812 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf9xq\" (UniqueName: \"kubernetes.io/projected/469c34ce-1d46-4d99-a1c4-ac180fd08322-kube-api-access-mf9xq\") pod \"migrator-59844c95c7-cwmk8\" (UID: 
\"469c34ce-1d46-4d99-a1c4-ac180fd08322\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.486376 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9n5l\" (UniqueName: \"kubernetes.io/projected/59743d02-8280-4571-b67f-fba4e4659d39-kube-api-access-d9n5l\") pod \"kube-storage-version-migrator-operator-b67b599dd-nlx8v\" (UID: \"59743d02-8280-4571-b67f-fba4e4659d39\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.487329 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwnxp\" (UniqueName: \"kubernetes.io/projected/ae631b64-6f22-4112-8fb8-aa2c5140275b-kube-api-access-zwnxp\") pod \"marketplace-operator-79b997595-kwffm\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.493548 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.520135 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.534979 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpr9w\" (UniqueName: \"kubernetes.io/projected/38e11ab1-ab41-4665-90ce-5c7aba1639f2-kube-api-access-lpr9w\") pod \"olm-operator-6b444d44fb-m299m\" (UID: \"38e11ab1-ab41-4665-90ce-5c7aba1639f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.535176 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bk27\" (UniqueName: \"kubernetes.io/projected/c80e8dad-f5d6-4a23-90f9-4de04483c9be-kube-api-access-6bk27\") pod \"machine-config-server-x28d9\" (UID: \"c80e8dad-f5d6-4a23-90f9-4de04483c9be\") " pod="openshift-machine-config-operator/machine-config-server-x28d9" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.538694 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ck6m\" (UniqueName: \"kubernetes.io/projected/18a59baf-2a09-41ff-94c9-1219cf47dfc2-kube-api-access-4ck6m\") pod \"multus-admission-controller-857f4d67dd-m8h87\" (UID: \"18a59baf-2a09-41ff-94c9-1219cf47dfc2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.539506 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft"] Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.540421 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550404-w8858" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.548706 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.558210 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 02:44:39.558556 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.058544657 +0000 UTC m=+204.648208389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.577453 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlgdf\" (UniqueName: \"kubernetes.io/projected/ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae-kube-api-access-mlgdf\") pod \"machine-config-operator-74547568cd-n9lvf\" (UID: \"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.595930 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.604308 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x28d9" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.659237 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 02:44:39.659769 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.159750451 +0000 UTC m=+204.749414183 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.669535 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.682358 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.701252 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.708943 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.719109 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.763085 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.763695 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vwwl2"] Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.768649 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.769507 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 02:44:39.769793 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.269782575 +0000 UTC m=+204.859446307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.805368 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87" Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.878701 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 02:44:39.878841 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.378817867 +0000 UTC m=+204.968481599 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.879341 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 02:44:39.879674 4901 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.379661793 +0000 UTC m=+204.969325525 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.893901 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zfqpl"] Mar 09 02:44:39 crc kubenswrapper[4901]: W0309 02:44:39.939128 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57ac07da_c623_40dc_b43b_87d5b43c4468.slice/crio-9351837179f00512f468e8cc717a03e5e1f3cfdb9ceeff80ad972f49aa079f9e WatchSource:0}: Error finding container 9351837179f00512f468e8cc717a03e5e1f3cfdb9ceeff80ad972f49aa079f9e: Status 404 returned error can't find the container with id 9351837179f00512f468e8cc717a03e5e1f3cfdb9ceeff80ad972f49aa079f9e Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.964399 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf"] Mar 09 02:44:39 crc kubenswrapper[4901]: I0309 02:44:39.980230 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:39 crc kubenswrapper[4901]: E0309 02:44:39.981145 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.481128746 +0000 UTC m=+205.070792478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.083894 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.084275 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.584259481 +0000 UTC m=+205.173923213 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.108177 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xc8gr"] Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.185035 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.185623 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.68560925 +0000 UTC m=+205.275272982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.191486 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" podStartSLOduration=133.191463775 podStartE2EDuration="2m13.191463775s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:40.188572904 +0000 UTC m=+204.778236636" watchObservedRunningTime="2026-03-09 02:44:40.191463775 +0000 UTC m=+204.781127507" Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.243606 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6c5c" podStartSLOduration=132.242302865 podStartE2EDuration="2m12.242302865s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:40.216845399 +0000 UTC m=+204.806509131" watchObservedRunningTime="2026-03-09 02:44:40.242302865 +0000 UTC m=+204.831966607" Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.271618 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9v4g4" event={"ID":"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872","Type":"ContainerStarted","Data":"88bd1d678f4a021c90d2a4e774b433e4771e5e2274b4768977146630d2c938de"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 
02:44:40.283856 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" event={"ID":"c8f28c79-5540-44c0-acea-aa36ce8a47d9","Type":"ContainerStarted","Data":"5ec9cc976de571abea8817eb67c3ead59dc7029fb1e8fc83f7f5543e7d91abce"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.286691 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.286985 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.786974759 +0000 UTC m=+205.376638491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.289155 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jznjb" event={"ID":"054fc716-7800-43b0-af23-328b685f89f9","Type":"ContainerStarted","Data":"fcdb60d311c373fa67d38db74505a3a5f6009695874ca0c2dac1dd8e900c21e1"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.289608 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jznjb" Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.312418 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x28d9" event={"ID":"c80e8dad-f5d6-4a23-90f9-4de04483c9be","Type":"ContainerStarted","Data":"825bbe8efa6da001c9de32ba2ab72ddd3f44b31581b62e7a194dce1c596911a5"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.326070 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" event={"ID":"9800e019-e239-4af9-9059-0c29be7ca479","Type":"ContainerStarted","Data":"520ca90efd9ef9324f1d6b148a1a0ff585b5435bbfe9ce132c5b99149ca572ff"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.338000 4901 patch_prober.go:28] interesting pod/downloads-7954f5f757-jznjb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 09 02:44:40 crc kubenswrapper[4901]: 
I0309 02:44:40.338050 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jznjb" podUID="054fc716-7800-43b0-af23-328b685f89f9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.362668 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" podStartSLOduration=133.362650825 podStartE2EDuration="2m13.362650825s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:40.297940327 +0000 UTC m=+204.887604079" watchObservedRunningTime="2026-03-09 02:44:40.362650825 +0000 UTC m=+204.952314557" Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.373699 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" event={"ID":"890abe57-aa9b-4c46-8a26-c2c1fd724fab","Type":"ContainerStarted","Data":"dedb703f62caa2b0ed7495796287a437699a55bc93f4f7c90f6ffa90715ec485"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.385062 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" event={"ID":"57ac07da-c623-40dc-b43b-87d5b43c4468","Type":"ContainerStarted","Data":"9351837179f00512f468e8cc717a03e5e1f3cfdb9ceeff80ad972f49aa079f9e"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.390655 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.394720 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.89470483 +0000 UTC m=+205.484368562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.396587 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" event={"ID":"1336f974-5df8-4d81-91bb-2c2364c87479","Type":"ContainerStarted","Data":"b148db18bab4d4bd94826d7e0be94bbc5990e9b068bc9c3967fe4dd2a27c86cd"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.404906 4901 generic.go:334] "Generic (PLEG): container finished" podID="e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0" containerID="8057953c2990a8c62aa97bce622f04e1cdf8af0a74022f32bfaea4666bb36cb1" exitCode=0 Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.404977 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" event={"ID":"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0","Type":"ContainerDied","Data":"8057953c2990a8c62aa97bce622f04e1cdf8af0a74022f32bfaea4666bb36cb1"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.405005 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" event={"ID":"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0","Type":"ContainerStarted","Data":"042a3336e1f70dcc1df2df2eb7b288fa2959be414be66f07517013aa220eec75"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.420741 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" event={"ID":"0bef1913-8737-48c8-bcf5-89daf1bd1c54","Type":"ContainerStarted","Data":"ba483cf28e26977022a8c8ce366cc98cb64776353c9ee50895f3e6e7743f51af"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.421564 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.422896 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" event={"ID":"64d1cf60-f9b7-4722-83aa-7967cd9827d6","Type":"ContainerStarted","Data":"34a46254127a91e27bb422e204879830112b8388e1505bb032ad4b8ea0eebdb2"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.429894 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" podStartSLOduration=132.429860513 podStartE2EDuration="2m12.429860513s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:40.419820515 +0000 UTC m=+205.009484237" watchObservedRunningTime="2026-03-09 02:44:40.429860513 +0000 UTC m=+205.019524245" Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.463588 4901 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tzf54 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: 
connect: connection refused" start-of-body= Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.463651 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.464862 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zfqpl" event={"ID":"730c2ff6-fe2c-425c-88a8-686d2d7617f0","Type":"ContainerStarted","Data":"40defa5ed5712fc7e4329733d478466da940e50177d4a600dad4804812601a94"} Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.496503 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.498765 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:40.998753014 +0000 UTC m=+205.588416746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.514866 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v478l"] Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.550109 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p949f"] Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.602919 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.603988 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.103973466 +0000 UTC m=+205.693637198 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.699267 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f"] Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.701006 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p7chk"] Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.705799 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.706235 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.206209453 +0000 UTC m=+205.795873175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.809327 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.809487 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.309451582 +0000 UTC m=+205.899115314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.809856 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.810321 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.310303598 +0000 UTC m=+205.899967330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.911650 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.911838 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.411811772 +0000 UTC m=+206.001475504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.911957 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:40 crc kubenswrapper[4901]: E0309 02:44:40.912363 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.412352159 +0000 UTC m=+206.002015891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:40 crc kubenswrapper[4901]: I0309 02:44:40.932239 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t2h5" podStartSLOduration=133.932202918 podStartE2EDuration="2m13.932202918s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:40.89847168 +0000 UTC m=+205.488135412" watchObservedRunningTime="2026-03-09 02:44:40.932202918 +0000 UTC m=+205.521866640" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.013130 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.013289 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.513269954 +0000 UTC m=+206.102933706 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.020996 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.021472 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.521460103 +0000 UTC m=+206.111123835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.122570 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.122717 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.622691688 +0000 UTC m=+206.212355420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.122830 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.123215 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.623205004 +0000 UTC m=+206.212868736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.223895 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.224263 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.724206972 +0000 UTC m=+206.313870704 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: W0309 02:44:41.256802 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12637446_5b6a_4b60_911a_335c765680b4.slice/crio-f5826337ae5b2cc95278cc39046d16b1bea9f32c041e2adc4690545491111a59 WatchSource:0}: Error finding container f5826337ae5b2cc95278cc39046d16b1bea9f32c041e2adc4690545491111a59: Status 404 returned error can't find the container with id f5826337ae5b2cc95278cc39046d16b1bea9f32c041e2adc4690545491111a59 Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.328138 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.328478 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:41.828466533 +0000 UTC m=+206.418130265 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.414727 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxqjl" podStartSLOduration=134.414707903 podStartE2EDuration="2m14.414707903s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:41.346974829 +0000 UTC m=+205.936638581" watchObservedRunningTime="2026-03-09 02:44:41.414707903 +0000 UTC m=+206.004371635" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.426199 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-np7wv"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.431475 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.431834 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 02:44:41.931814685 +0000 UTC m=+206.521478417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.442300 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.468190 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jznjb" podStartSLOduration=134.468175876 podStartE2EDuration="2m14.468175876s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:41.466562595 +0000 UTC m=+206.056226327" watchObservedRunningTime="2026-03-09 02:44:41.468175876 +0000 UTC m=+206.057839608" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.468491 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" podStartSLOduration=134.468485526 podStartE2EDuration="2m14.468485526s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:41.444630251 +0000 UTC m=+206.034293983" watchObservedRunningTime="2026-03-09 02:44:41.468485526 +0000 UTC m=+206.058149258" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.484736 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9v4g4" event={"ID":"5eb34c1b-916a-4f45-a6b3-e5e8b17a0872","Type":"ContainerStarted","Data":"76d9e1ec2c85dda1fff5633c62ae9623f90195cbb590ef1beb834d9f39538ee0"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.486396 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" event={"ID":"1336f974-5df8-4d81-91bb-2c2364c87479","Type":"ContainerStarted","Data":"6e20feb90a92ed841626d335d4cea80f62fca99c66cdb13614e2fb1a83896a82"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.498117 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p949f" event={"ID":"c9a5b87c-5fb9-44a9-9f71-84aec278ac58","Type":"ContainerStarted","Data":"7aa0526df72abd5d117e68d1d164b9cb34bcdbef78929cb6fc939d4831aa979d"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.502687 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" event={"ID":"890abe57-aa9b-4c46-8a26-c2c1fd724fab","Type":"ContainerStarted","Data":"bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.503186 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.508435 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9v4g4" podStartSLOduration=134.50841832 podStartE2EDuration="2m14.50841832s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:41.50555084 +0000 UTC m=+206.095214582" 
watchObservedRunningTime="2026-03-09 02:44:41.50841832 +0000 UTC m=+206.098082052" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.510392 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-v478l" event={"ID":"9c86283c-c972-4467-b374-5d638dbfd9b9","Type":"ContainerStarted","Data":"f88618c481c1c848f129341f68def19d32eec76ddc81ef12908a2d2f53c78b26"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.512560 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xc8gr" event={"ID":"65646690-9b87-47d8-a187-207924a2c486","Type":"ContainerStarted","Data":"f8a83cdb6731c81c2438eb2c6959538df9187f75414e3ab0a69797b0115de3f4"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.516525 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x28d9" event={"ID":"c80e8dad-f5d6-4a23-90f9-4de04483c9be","Type":"ContainerStarted","Data":"30f7f4a187d8bf2520e1d33d2964832c843d1c5ceb9b5a6d1feef34f873795e7"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.519146 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" event={"ID":"9800e019-e239-4af9-9059-0c29be7ca479","Type":"ContainerStarted","Data":"b1f954f59c8381bd7a63743ab413d1584fc267aea3e036cefbb702e5233c5787"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.529337 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" event={"ID":"99c51354-0ea1-4e7a-bd34-1bb57e6422c0","Type":"ContainerStarted","Data":"e48d53f1f4e9b80d4e36f16c67e8c2546d72e168e7045cd163badc0c57eae829"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.539831 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" 
event={"ID":"57ac07da-c623-40dc-b43b-87d5b43c4468","Type":"ContainerStarted","Data":"b7655a258d36510a374fecf62dde2281c1b686e6230d2acdcc79550c5065e2e7"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.540672 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.541053 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.041041823 +0000 UTC m=+206.630705545 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.545047 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" event={"ID":"c8f28c79-5540-44c0-acea-aa36ce8a47d9","Type":"ContainerStarted","Data":"eec40bf16c6c513015d051185d64ab5d94ac01949c25ff516daa236a47e3409e"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.546967 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" 
event={"ID":"a7e3f000-a2fc-4ebb-a6ee-00fda57097b7","Type":"ContainerStarted","Data":"a1f11388d96768ba0d2cae822c6bc950a73bcfb3ef30ee48d987595ae86ed143"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.548300 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zfqpl" event={"ID":"730c2ff6-fe2c-425c-88a8-686d2d7617f0","Type":"ContainerStarted","Data":"2d47e049bd3ccbd4cf1bb9cf8be23fffe2fc79e4099c17bf4f98ae24907af7d6"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.550624 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zfqpl" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.550784 4901 patch_prober.go:28] interesting pod/console-operator-58897d9998-zfqpl container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.550834 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zfqpl" podUID="730c2ff6-fe2c-425c-88a8-686d2d7617f0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.556049 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" event={"ID":"12637446-5b6a-4b60-911a-335c765680b4","Type":"ContainerStarted","Data":"f5826337ae5b2cc95278cc39046d16b1bea9f32c041e2adc4690545491111a59"} Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.565399 4901 patch_prober.go:28] interesting pod/downloads-7954f5f757-jznjb container/download-server namespace/openshift-console: Readiness 
probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.565433 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jznjb" podUID="054fc716-7800-43b0-af23-328b685f89f9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.565696 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.579694 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" podStartSLOduration=133.579679117 podStartE2EDuration="2m13.579679117s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:41.540569678 +0000 UTC m=+206.130233410" watchObservedRunningTime="2026-03-09 02:44:41.579679117 +0000 UTC m=+206.169342849" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.581400 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xtpw8" podStartSLOduration=134.581393481 podStartE2EDuration="2m14.581393481s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:41.57914149 +0000 UTC m=+206.168805222" watchObservedRunningTime="2026-03-09 02:44:41.581393481 +0000 UTC m=+206.171057213" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.620140 4901 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-x28d9" podStartSLOduration=5.620119907 podStartE2EDuration="5.620119907s" podCreationTimestamp="2026-03-09 02:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:41.615740198 +0000 UTC m=+206.205403930" watchObservedRunningTime="2026-03-09 02:44:41.620119907 +0000 UTC m=+206.209783659" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.632097 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.642272 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.656365 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.156328593 +0000 UTC m=+206.745992325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.678093 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.715848 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-stq8d" podStartSLOduration=134.715825747 podStartE2EDuration="2m14.715825747s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:41.709693093 +0000 UTC m=+206.299356835" watchObservedRunningTime="2026-03-09 02:44:41.715825747 +0000 UTC m=+206.305489479" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.762684 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:41 crc kubenswrapper[4901]: W0309 02:44:41.768882 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0860921b_92bb_498b_97b1_ee87b6e985cc.slice/crio-394ebf3a6145edbe56b008d96dc3219c04a0598a4fb10d18fe62c51d470fa116 
WatchSource:0}: Error finding container 394ebf3a6145edbe56b008d96dc3219c04a0598a4fb10d18fe62c51d470fa116: Status 404 returned error can't find the container with id 394ebf3a6145edbe56b008d96dc3219c04a0598a4fb10d18fe62c51d470fa116 Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.770827 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m52jv"] Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.771285 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.271268623 +0000 UTC m=+206.860932355 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.794595 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.794631 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.812021 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-vwwl2" podStartSLOduration=134.812001342 podStartE2EDuration="2m14.812001342s" podCreationTimestamp="2026-03-09 02:42:27 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:41.778089688 +0000 UTC m=+206.367753410" watchObservedRunningTime="2026-03-09 02:44:41.812001342 +0000 UTC m=+206.401665074" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.812831 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.814590 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zfqpl" podStartSLOduration=134.814577554 podStartE2EDuration="2m14.814577554s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:41.800590841 +0000 UTC m=+206.390254573" watchObservedRunningTime="2026-03-09 02:44:41.814577554 +0000 UTC m=+206.404241286" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.862260 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.864793 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.865205 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 02:44:42.365190746 +0000 UTC m=+206.954854478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.886946 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.887116 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.898075 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.903518 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m8h87"] Mar 09 02:44:41 crc kubenswrapper[4901]: W0309 02:44:41.919912 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2009785b_23de_4c85_8dbc_285219ade858.slice/crio-8d7dbb6dcb16f32d999d7a849299bae3e398ff5fd889e658fe6858a79db4a54d WatchSource:0}: Error finding container 8d7dbb6dcb16f32d999d7a849299bae3e398ff5fd889e658fe6858a79db4a54d: Status 404 returned error can't find the container with id 8d7dbb6dcb16f32d999d7a849299bae3e398ff5fd889e658fe6858a79db4a54d Mar 09 02:44:41 crc kubenswrapper[4901]: W0309 02:44:41.932570 4901 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a59baf_2a09_41ff_94c9_1219cf47dfc2.slice/crio-98c85fa5e4badeacc69b131e1f65ee15f170a6b6433f7b03d13a538afd297657 WatchSource:0}: Error finding container 98c85fa5e4badeacc69b131e1f65ee15f170a6b6433f7b03d13a538afd297657: Status 404 returned error can't find the container with id 98c85fa5e4badeacc69b131e1f65ee15f170a6b6433f7b03d13a538afd297657 Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.936821 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.951337 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf"] Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.952546 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.953253 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.967359 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:41 crc kubenswrapper[4901]: E0309 02:44:41.967698 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 02:44:42.467685841 +0000 UTC m=+207.057349573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.974431 4901 patch_prober.go:28] interesting pod/apiserver-76f77b778f-92tt5 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 09 02:44:41 crc kubenswrapper[4901]: [+]log ok Mar 09 02:44:41 crc kubenswrapper[4901]: [+]etcd ok Mar 09 02:44:41 crc kubenswrapper[4901]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 09 02:44:41 crc kubenswrapper[4901]: [+]poststarthook/generic-apiserver-start-informers ok Mar 09 02:44:41 crc kubenswrapper[4901]: [+]poststarthook/max-in-flight-filter ok Mar 09 02:44:41 crc kubenswrapper[4901]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 09 02:44:41 crc kubenswrapper[4901]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 09 02:44:41 crc kubenswrapper[4901]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 09 02:44:41 crc kubenswrapper[4901]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 09 02:44:41 crc kubenswrapper[4901]: [+]poststarthook/project.openshift.io-projectcache ok Mar 09 02:44:41 crc kubenswrapper[4901]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 09 02:44:41 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-startinformers ok Mar 09 
02:44:41 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 09 02:44:41 crc kubenswrapper[4901]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 09 02:44:41 crc kubenswrapper[4901]: livez check failed Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.974480 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" podUID="bffc6cea-5224-4f60-b2c2-3e93aa5cff97" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 02:44:41 crc kubenswrapper[4901]: W0309 02:44:41.982677 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccdc6c7c_cc5b_4b4e_a1d3_663d9a7900ae.slice/crio-d8f153997b809c50ebe9a73103b87ab77677c23d2d83342e00de42478ddc9ab6 WatchSource:0}: Error finding container d8f153997b809c50ebe9a73103b87ab77677c23d2d83342e00de42478ddc9ab6: Status 404 returned error can't find the container with id d8f153997b809c50ebe9a73103b87ab77677c23d2d83342e00de42478ddc9ab6 Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.998588 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.998637 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:41 crc kubenswrapper[4901]: I0309 02:44:41.998716 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m"] Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.001704 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m"] Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.037813 4901 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.052064 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lgxp2"] Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.069032 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.070067 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.070160 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.070202 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 
09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.070266 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:42 crc kubenswrapper[4901]: E0309 02:44:42.073637 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.573621415 +0000 UTC m=+207.163285147 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.074688 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.080741 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.081435 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.081769 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:44:42 crc kubenswrapper[4901]: W0309 02:44:42.084332 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38e11ab1_ab41_4665_90ce_5c7aba1639f2.slice/crio-f8b99521a6a64922eec12dfbd7b93c9291f9873abc10a10cb675f76414a42b8f WatchSource:0}: Error finding container f8b99521a6a64922eec12dfbd7b93c9291f9873abc10a10cb675f76414a42b8f: Status 404 returned error can't find the container with id f8b99521a6a64922eec12dfbd7b93c9291f9873abc10a10cb675f76414a42b8f Mar 09 02:44:42 crc kubenswrapper[4901]: W0309 02:44:42.092845 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4a65957_3afe_4859_8ec3_4b3a2180e744.slice/crio-28f442847a6794bd27b95bc1174d8038c4a8b2f1e91d83ae844def1d17a5d258 WatchSource:0}: Error finding container 28f442847a6794bd27b95bc1174d8038c4a8b2f1e91d83ae844def1d17a5d258: Status 404 returned error 
can't find the container with id 28f442847a6794bd27b95bc1174d8038c4a8b2f1e91d83ae844def1d17a5d258 Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.148073 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550404-w8858"] Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.156770 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kwffm"] Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.173707 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:42 crc kubenswrapper[4901]: E0309 02:44:42.174998 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.674981534 +0000 UTC m=+207.264645266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.233036 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 02:44:42 crc kubenswrapper[4901]: W0309 02:44:42.236583 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod680e9e87_71a2_402c_84f2_e8eb2b7a4c44.slice/crio-42b0097386974baecd436dfb54d11bfe48767366b4c548092d36b19648b087a0 WatchSource:0}: Error finding container 42b0097386974baecd436dfb54d11bfe48767366b4c548092d36b19648b087a0: Status 404 returned error can't find the container with id 42b0097386974baecd436dfb54d11bfe48767366b4c548092d36b19648b087a0 Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.242251 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.250478 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.274272 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:42 crc kubenswrapper[4901]: E0309 02:44:42.274986 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.77496385 +0000 UTC m=+207.364627582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.282496 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.383511 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:42 crc kubenswrapper[4901]: E0309 02:44:42.384175 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.884162577 +0000 UTC m=+207.473826309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.384191 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.395624 4901 patch_prober.go:28] interesting pod/router-default-5444994796-9v4g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 02:44:42 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Mar 09 02:44:42 crc kubenswrapper[4901]: [+]process-running ok Mar 09 02:44:42 crc kubenswrapper[4901]: healthz check failed Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.395685 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9v4g4" podUID="5eb34c1b-916a-4f45-a6b3-e5e8b17a0872" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.484686 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:42 crc kubenswrapper[4901]: E0309 02:44:42.485166 4901 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:42.985149305 +0000 UTC m=+207.574813037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.587292 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.587296 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p949f" event={"ID":"c9a5b87c-5fb9-44a9-9f71-84aec278ac58","Type":"ContainerStarted","Data":"2f9e64adf957d42006187d943a81a77eee5343903affce6374ebde14fb9ead41"} Mar 09 02:44:42 crc kubenswrapper[4901]: E0309 02:44:42.587808 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:43.087793405 +0000 UTC m=+207.677457137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.592325 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" event={"ID":"f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93","Type":"ContainerStarted","Data":"be5f91ebb96981511c859d9a9da19637e7acf7d584d5e3be1278fd7eb914455f"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.592364 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" event={"ID":"f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93","Type":"ContainerStarted","Data":"aed6491d5f1a8aa9b8d50211f693c94130ce5c298c31ef5eadadeae24b622596"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.616727 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" event={"ID":"59743d02-8280-4571-b67f-fba4e4659d39","Type":"ContainerStarted","Data":"33dfbf52e88b3a34cb7566dbff114ddc8953664841d49b179709a1ddc52b6093"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.617159 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" event={"ID":"59743d02-8280-4571-b67f-fba4e4659d39","Type":"ContainerStarted","Data":"79fa4e2c044a0d2f141fa4ae7c1876412233c8cc1a400976adc932b1c06968c9"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.663391 4901 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p949f" podStartSLOduration=6.663373867 podStartE2EDuration="6.663373867s" podCreationTimestamp="2026-03-09 02:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:42.616422651 +0000 UTC m=+207.206086383" watchObservedRunningTime="2026-03-09 02:44:42.663373867 +0000 UTC m=+207.253037589" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.664813 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nlx8v" podStartSLOduration=134.664807293 podStartE2EDuration="2m14.664807293s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:42.662551561 +0000 UTC m=+207.252215293" watchObservedRunningTime="2026-03-09 02:44:42.664807293 +0000 UTC m=+207.254471025" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.689835 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:42 crc kubenswrapper[4901]: E0309 02:44:42.690930 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:43.190915779 +0000 UTC m=+207.780579511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.704122 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" event={"ID":"12637446-5b6a-4b60-911a-335c765680b4","Type":"ContainerStarted","Data":"da264a784d452f2664e4764a200abbc9309322e5109f2f7369a8a5519d0f2285"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.735591 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550404-w8858" event={"ID":"680e9e87-71a2-402c-84f2-e8eb2b7a4c44","Type":"ContainerStarted","Data":"42b0097386974baecd436dfb54d11bfe48767366b4c548092d36b19648b087a0"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.736209 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvk7f" podStartSLOduration=135.736191343 podStartE2EDuration="2m15.736191343s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:42.735554633 +0000 UTC m=+207.325218365" watchObservedRunningTime="2026-03-09 02:44:42.736191343 +0000 UTC m=+207.325855075" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.749895 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" 
event={"ID":"8e691e4a-54da-49cd-acaf-e1b14cadde2e","Type":"ContainerStarted","Data":"7b46e7f1a4cbe2613f739147426578e8711e9a6ea9cd50a74ae492bafdd0b04d"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.749938 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" event={"ID":"8e691e4a-54da-49cd-acaf-e1b14cadde2e","Type":"ContainerStarted","Data":"71a9220af436d9e3b293df60c5a6de11aa17e956ef50c33105c4c2e5abb6e5fd"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.759074 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" event={"ID":"0860921b-92bb-498b-97b1-ee87b6e985cc","Type":"ContainerStarted","Data":"10cf70e93e3f8689610a32e511bf53c67fa08b5fa36bdb2c0c1527731d1c04e7"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.759121 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" event={"ID":"0860921b-92bb-498b-97b1-ee87b6e985cc","Type":"ContainerStarted","Data":"394ebf3a6145edbe56b008d96dc3219c04a0598a4fb10d18fe62c51d470fa116"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.794913 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:42 crc kubenswrapper[4901]: E0309 02:44:42.796100 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 02:44:43.296088329 +0000 UTC m=+207.885752061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.802960 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m52jv" podStartSLOduration=134.802947686 podStartE2EDuration="2m14.802947686s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:42.801861002 +0000 UTC m=+207.391524744" watchObservedRunningTime="2026-03-09 02:44:42.802947686 +0000 UTC m=+207.392611418" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.824553 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" event={"ID":"463898fa-b0fc-411e-abce-b0c64f32e240","Type":"ContainerStarted","Data":"323c3793b520f92329832e103418e0267fc6fe9c77c8a871c78284925f313338"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.833521 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34404: no serving certificate available for the kubelet" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.835440 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" 
event={"ID":"e3336f36-4389-49db-a669-fe0cbc0bfdfd","Type":"ContainerStarted","Data":"f09bcee651e918ddf27c2b14ec0f74a86cf3b972a8f39631c4b0972b19b67c2d"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.835489 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" event={"ID":"e3336f36-4389-49db-a669-fe0cbc0bfdfd","Type":"ContainerStarted","Data":"40da90ad82ded32d62374fd895952217b9b9dcb837fdf64357dec741811ba390"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.839311 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87" event={"ID":"18a59baf-2a09-41ff-94c9-1219cf47dfc2","Type":"ContainerStarted","Data":"98c85fa5e4badeacc69b131e1f65ee15f170a6b6433f7b03d13a538afd297657"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.841708 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" event={"ID":"7d39bde0-c78d-4e46-b388-78ce5fbadb9f","Type":"ContainerStarted","Data":"c71c31d546744472305bf482b2ce921664ca5868e6973e4cf045bdb87155c7e9"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.842564 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.846000 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" event={"ID":"c4a65957-3afe-4859-8ec3-4b3a2180e744","Type":"ContainerStarted","Data":"28f442847a6794bd27b95bc1174d8038c4a8b2f1e91d83ae844def1d17a5d258"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.853759 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-v478l" 
event={"ID":"9c86283c-c972-4467-b374-5d638dbfd9b9","Type":"ContainerStarted","Data":"e968a511dd01c13f967b4fa089493ac35a848353ef1a9e87a1695b7669fd15ce"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.863591 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" podStartSLOduration=135.863571516 podStartE2EDuration="2m15.863571516s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:42.860646163 +0000 UTC m=+207.450309895" watchObservedRunningTime="2026-03-09 02:44:42.863571516 +0000 UTC m=+207.453235248" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.884083 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" event={"ID":"99c51354-0ea1-4e7a-bd34-1bb57e6422c0","Type":"ContainerStarted","Data":"fe8401baac4a079f072578df248b1212441d395a5f4147e50b503b53b9c3361e"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.884448 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" event={"ID":"99c51354-0ea1-4e7a-bd34-1bb57e6422c0","Type":"ContainerStarted","Data":"b1a3e2e4429ef200d4159db216055d96b7cfb870d1918713f26d536f822cf888"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.892545 4901 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fl2d7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.892587 4901 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" podUID="7d39bde0-c78d-4e46-b388-78ce5fbadb9f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.896242 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:42 crc kubenswrapper[4901]: E0309 02:44:42.898824 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:43.398809832 +0000 UTC m=+207.988473564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.900600 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" podStartSLOduration=134.900578428 podStartE2EDuration="2m14.900578428s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:42.899839254 +0000 UTC m=+207.489502986" watchObservedRunningTime="2026-03-09 02:44:42.900578428 +0000 UTC m=+207.490242160" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.905532 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8" event={"ID":"469c34ce-1d46-4d99-a1c4-ac180fd08322","Type":"ContainerStarted","Data":"6a5a5e911650ad847d4913568708230fdbf478ae4c6515e320e93cf5fb508023"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.905672 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8" event={"ID":"469c34ce-1d46-4d99-a1c4-ac180fd08322","Type":"ContainerStarted","Data":"b561f66314f767f792d35dfb8f9291154c41d4cfd8afa07695eb8e6f37db5c0d"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.906742 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-v478l" podStartSLOduration=134.906728232 podStartE2EDuration="2m14.906728232s" 
podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:42.876278088 +0000 UTC m=+207.465941840" watchObservedRunningTime="2026-03-09 02:44:42.906728232 +0000 UTC m=+207.496391964" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.908904 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2" event={"ID":"2009785b-23de-4c85-8dbc-285219ade858","Type":"ContainerStarted","Data":"8d7dbb6dcb16f32d999d7a849299bae3e398ff5fd889e658fe6858a79db4a54d"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.927074 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" podStartSLOduration=135.927060876 podStartE2EDuration="2m15.927060876s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:42.925866148 +0000 UTC m=+207.515529880" watchObservedRunningTime="2026-03-09 02:44:42.927060876 +0000 UTC m=+207.516724598" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.935567 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34412: no serving certificate available for the kubelet" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.942029 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" event={"ID":"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae","Type":"ContainerStarted","Data":"d8f153997b809c50ebe9a73103b87ab77677c23d2d83342e00de42478ddc9ab6"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.944422 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" 
event={"ID":"9e9eea51-12bc-40f5-94b0-3fb75a48b898","Type":"ContainerStarted","Data":"d915ffb37c88c9bf0c20902a45c758dc7cead6a7e54bf7e9473912b76a058b1f"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.945920 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-np7wv" event={"ID":"1969ad35-8f20-4676-b7b6-375c5b93aa14","Type":"ContainerStarted","Data":"9425e44d861b829b634d81c457f73a8cc539c47456c802fb1c8923c9a37c6f2f"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.945950 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-np7wv" event={"ID":"1969ad35-8f20-4676-b7b6-375c5b93aa14","Type":"ContainerStarted","Data":"d26b67b18053b667381a7ad8d69804028cb7bdf893ba953a12764b09ce0f0c32"} Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.960940 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2" podStartSLOduration=134.960923068 podStartE2EDuration="2m14.960923068s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:42.960882997 +0000 UTC m=+207.550546739" watchObservedRunningTime="2026-03-09 02:44:42.960923068 +0000 UTC m=+207.550586810" Mar 09 02:44:42 crc kubenswrapper[4901]: I0309 02:44:42.975233 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" event={"ID":"ae631b64-6f22-4112-8fb8-aa2c5140275b","Type":"ContainerStarted","Data":"db246342e47fd6acacbe8eabe928deb9ca559a03aad68d500c550d7c64a85815"} Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:42.999335 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.001656 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" event={"ID":"d65d3de3-6f6f-42e4-9853-bc5b7bad5236","Type":"ContainerStarted","Data":"3d40454c71e03ab8391cfcb90363c8ecc32c64c8e4f448f6ce09a53ba8e64e1c"} Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.001697 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" event={"ID":"d65d3de3-6f6f-42e4-9853-bc5b7bad5236","Type":"ContainerStarted","Data":"8cc4187e4f93c91470c1f0f8ff5f9397343e361b60515c4bf4abb352d307e2a9"} Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.003108 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.005314 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:43.505296723 +0000 UTC m=+208.094960455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.035620 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" podStartSLOduration=135.035590832 podStartE2EDuration="2m15.035590832s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:43.032859496 +0000 UTC m=+207.622523228" watchObservedRunningTime="2026-03-09 02:44:43.035590832 +0000 UTC m=+207.625254574" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.055910 4901 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9fxwq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:5443/healthz\": dial tcp 10.217.0.24:5443: connect: connection refused" start-of-body= Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.055973 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" podUID="d65d3de3-6f6f-42e4-9853-bc5b7bad5236" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.24:5443/healthz\": dial tcp 10.217.0.24:5443: connect: connection refused" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.061763 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34424: no serving certificate available for the kubelet" Mar 09 
02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.104925 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.105519 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:43.605477025 +0000 UTC m=+208.195140757 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.116100 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" event={"ID":"a7e3f000-a2fc-4ebb-a6ee-00fda57097b7","Type":"ContainerStarted","Data":"cd160b5b6648747c6c26e0c8346a27c5f3ed5ff5e78f5f2542eccdda2b6e834b"} Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.142005 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34426: no serving certificate available for the kubelet" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.159582 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" podStartSLOduration=136.159564017 
podStartE2EDuration="2m16.159564017s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:43.158531395 +0000 UTC m=+207.748195117" watchObservedRunningTime="2026-03-09 02:44:43.159564017 +0000 UTC m=+207.749227749" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.163025 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" event={"ID":"e4c5ab6f-bc3e-48e7-a896-c8f69da5d9a0","Type":"ContainerStarted","Data":"650cda18a741655987a8ccd5a13dc16cdf022015d3f30ffb1b8893163c191e75"} Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.163297 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.182596 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" event={"ID":"c8f28c79-5540-44c0-acea-aa36ce8a47d9","Type":"ContainerStarted","Data":"933d69b89064f287db275f118b1c5f4fac4c13701d263027a7a272ae368f5b37"} Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.187278 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" event={"ID":"4816493b-f6f6-425b-88f2-67aa948f3c67","Type":"ContainerStarted","Data":"53155ece6ca76ce22871799d20f8b83dd16faecde165031ab892d30301215ffe"} Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.187788 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" event={"ID":"4816493b-f6f6-425b-88f2-67aa948f3c67","Type":"ContainerStarted","Data":"18aaeff1ab3a2f58ba01606833c550cc7ad958f82fd97b87a8033fdac261096d"} Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 
02:44:43.208333 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.208619 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:43.70860484 +0000 UTC m=+208.298268562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.231174 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" podStartSLOduration=136.231155694 podStartE2EDuration="2m16.231155694s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:43.198374096 +0000 UTC m=+207.788037818" watchObservedRunningTime="2026-03-09 02:44:43.231155694 +0000 UTC m=+207.820819426" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.231458 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kxd66" podStartSLOduration=136.231453183 podStartE2EDuration="2m16.231453183s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:43.230771262 +0000 UTC m=+207.820434994" watchObservedRunningTime="2026-03-09 02:44:43.231453183 +0000 UTC m=+207.821116915" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.234870 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34432: no serving certificate available for the kubelet" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.245156 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xc8gr" event={"ID":"65646690-9b87-47d8-a187-207924a2c486","Type":"ContainerStarted","Data":"5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045"} Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.281980 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sclvh" podStartSLOduration=136.281955882 podStartE2EDuration="2m16.281955882s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:43.270777628 +0000 UTC m=+207.860441360" watchObservedRunningTime="2026-03-09 02:44:43.281955882 +0000 UTC m=+207.871619614" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.295395 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" event={"ID":"38e11ab1-ab41-4665-90ce-5c7aba1639f2","Type":"ContainerStarted","Data":"f8b99521a6a64922eec12dfbd7b93c9291f9873abc10a10cb675f76414a42b8f"} Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.324447 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.325828 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:43.825813221 +0000 UTC m=+208.415476953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.338055 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34446: no serving certificate available for the kubelet" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.340049 4901 patch_prober.go:28] interesting pod/downloads-7954f5f757-jznjb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.340104 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jznjb" podUID="054fc716-7800-43b0-af23-328b685f89f9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: 
connect: connection refused" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.347501 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zfqpl" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.355578 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xs7zs" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.370306 4901 patch_prober.go:28] interesting pod/router-default-5444994796-9v4g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 02:44:43 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Mar 09 02:44:43 crc kubenswrapper[4901]: [+]process-running ok Mar 09 02:44:43 crc kubenswrapper[4901]: healthz check failed Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.370662 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9v4g4" podUID="5eb34c1b-916a-4f45-a6b3-e5e8b17a0872" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.383523 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" podStartSLOduration=135.383502687 podStartE2EDuration="2m15.383502687s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:43.37916481 +0000 UTC m=+207.968828542" watchObservedRunningTime="2026-03-09 02:44:43.383502687 +0000 UTC m=+207.973166419" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.384048 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-f9d7485db-xc8gr" podStartSLOduration=136.384042945 podStartE2EDuration="2m16.384042945s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:43.343604744 +0000 UTC m=+207.933268476" watchObservedRunningTime="2026-03-09 02:44:43.384042945 +0000 UTC m=+207.973706677" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.430415 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.448275 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34456: no serving certificate available for the kubelet" Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.451386 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:43.951368606 +0000 UTC m=+208.541032338 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.542013 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.543208 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.043181923 +0000 UTC m=+208.632845655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.565741 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.566068 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.066055027 +0000 UTC m=+208.655718749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.594580 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34458: no serving certificate available for the kubelet" Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.667590 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.667841 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.167825789 +0000 UTC m=+208.757489521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.773070 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.774100 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.274081424 +0000 UTC m=+208.863745156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.876319 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.876743 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.376724923 +0000 UTC m=+208.966388655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:43 crc kubenswrapper[4901]: I0309 02:44:43.977384 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:43 crc kubenswrapper[4901]: E0309 02:44:43.977903 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.477886376 +0000 UTC m=+209.067550108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.079046 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.079192 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.579156143 +0000 UTC m=+209.168819875 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.079473 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.079813 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.579805333 +0000 UTC m=+209.169469065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.180777 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.181070 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.681056099 +0000 UTC m=+209.270719831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.279261 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34460: no serving certificate available for the kubelet" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.282655 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.283026 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.783008577 +0000 UTC m=+209.372672309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.315175 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" event={"ID":"7d39bde0-c78d-4e46-b388-78ce5fbadb9f","Type":"ContainerStarted","Data":"035db68414abdae56b60df353f59dfcdd5e0a8749c50c0190355e42230bf7da0"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.323504 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fl2d7" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.324232 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p7chk" event={"ID":"a7e3f000-a2fc-4ebb-a6ee-00fda57097b7","Type":"ContainerStarted","Data":"631c455eb589676bd0faa5f51074eac2d12b5f8e00117d8f20291ad372b18343"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.336968 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" event={"ID":"9e9eea51-12bc-40f5-94b0-3fb75a48b898","Type":"ContainerStarted","Data":"7f5ff1e97355071cbe0b0365a5433aefe4ab67547e44107873bb42dadfaf654b"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.344331 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" 
event={"ID":"f1f65b0f-1ef9-445f-8e08-3a4b7bc82f93","Type":"ContainerStarted","Data":"65c00e8c2eabd7b60a381216da3cb10f175d7a2ec1e30cd100cfd5dbc3c8c9db"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.345182 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.375208 4901 patch_prober.go:28] interesting pod/router-default-5444994796-9v4g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 02:44:44 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Mar 09 02:44:44 crc kubenswrapper[4901]: [+]process-running ok Mar 09 02:44:44 crc kubenswrapper[4901]: healthz check failed Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.375298 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9v4g4" podUID="5eb34c1b-916a-4f45-a6b3-e5e8b17a0872" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.380338 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" podStartSLOduration=136.380315888 podStartE2EDuration="2m16.380315888s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:44.378428308 +0000 UTC m=+208.968092040" watchObservedRunningTime="2026-03-09 02:44:44.380315888 +0000 UTC m=+208.969979630" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.386523 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.387713 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.887698531 +0000 UTC m=+209.477362263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.410208 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8" event={"ID":"469c34ce-1d46-4d99-a1c4-ac180fd08322","Type":"ContainerStarted","Data":"afaceeade2210e96214a030b6806ccf9025fdab91d1ac17dc86fda28bf00be57"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.417298 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" event={"ID":"c4a65957-3afe-4859-8ec3-4b3a2180e744","Type":"ContainerStarted","Data":"e5088a82499cb185002eb93ff301f3a2e230a0597148678f2659d6f1247a1621"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.420578 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-np7wv" 
event={"ID":"1969ad35-8f20-4676-b7b6-375c5b93aa14","Type":"ContainerStarted","Data":"107a33af21e663154bd66c0facbdbbe3f5873ef09c5334cdebc5bfe6a5ceaad7"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.421227 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-np7wv" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.424225 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8m6x2" event={"ID":"2009785b-23de-4c85-8dbc-285219ade858","Type":"ContainerStarted","Data":"daa84a0676dfa761ea6be39be1f9048efc4b3e9d8b353a1349139411db56f4e9"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.429455 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" event={"ID":"463898fa-b0fc-411e-abce-b0c64f32e240","Type":"ContainerStarted","Data":"86df19f8ae90403670d22cd4c6bc6a632536b31abd3d4da9a959ce43a44d2c33"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.433211 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"23653f5c04b5b79885123a85efe9b7454695789e45f8ec140daa61b8c9d26358"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.433264 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ea1f21717f9882add64d161a7dc4aa18c0a6e6cc671b05e3634ac4f664cdf629"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.438116 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" 
event={"ID":"8e691e4a-54da-49cd-acaf-e1b14cadde2e","Type":"ContainerStarted","Data":"1d72fa520c10f9b2c71a4f0e469ea553f5b900eb26c9e7ea4f7e51161424c1f5"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.441256 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87" event={"ID":"18a59baf-2a09-41ff-94c9-1219cf47dfc2","Type":"ContainerStarted","Data":"fc529172d17c3bf755e7463751514823378fc1f270d86c0fd35cd48a7fef1f45"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.441285 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87" event={"ID":"18a59baf-2a09-41ff-94c9-1219cf47dfc2","Type":"ContainerStarted","Data":"04b4f036a53250c8c327646ad2ef7ad67da55031cf13f8205cd697dec41af137"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.442857 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" event={"ID":"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae","Type":"ContainerStarted","Data":"518598073b7d48094316d2ac8c1cbf7114074b917b50a6973649eaf4a81f460b"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.442887 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" event={"ID":"ccdc6c7c-cc5b-4b4e-a1d3-663d9a7900ae","Type":"ContainerStarted","Data":"37d68f4b12fd90179dc1cf1742b05e100d43b726252d962c82c6cff09989cff5"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.445245 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" event={"ID":"ae631b64-6f22-4112-8fb8-aa2c5140275b","Type":"ContainerStarted","Data":"c0fe91b31aeae00aa38b08b18868079f749e2b65d9d799ba9b4efc927cb5ab96"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.446061 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.446928 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b678cd47f177b2ffceba2cf4fe879ed7e3e66a3652e27d1a8996f50ac1a1ed91"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.446955 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9da89294aaefe2f4875bef41351351907ad1541ccceacef94b9b7a0e00d41422"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.447356 4901 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kwffm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.447398 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" podUID="ae631b64-6f22-4112-8fb8-aa2c5140275b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.449438 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4a7d499afbf5329f3a2739a9c61f5cc7d9c0553f128f2b6af90143a8b6c783ea"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.449463 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0f980cc1d41301131d1abe8abe83114cb41f38a7abc783bd0d7a3ac93c5d9fdf"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.449793 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.465461 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cwmk8" podStartSLOduration=136.465441233 podStartE2EDuration="2m16.465441233s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:44.463626165 +0000 UTC m=+209.053289907" watchObservedRunningTime="2026-03-09 02:44:44.465441233 +0000 UTC m=+209.055104965" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.470144 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" event={"ID":"38e11ab1-ab41-4665-90ce-5c7aba1639f2","Type":"ContainerStarted","Data":"5625d95398d4664325a31a27d2aada664a291b1c23a00d8d91b0812b86d397f1"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.471091 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.482373 4901 generic.go:334] "Generic (PLEG): container finished" podID="e3336f36-4389-49db-a669-fe0cbc0bfdfd" containerID="f09bcee651e918ddf27c2b14ec0f74a86cf3b972a8f39631c4b0972b19b67c2d" exitCode=0 Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.482897 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" event={"ID":"e3336f36-4389-49db-a669-fe0cbc0bfdfd","Type":"ContainerDied","Data":"f09bcee651e918ddf27c2b14ec0f74a86cf3b972a8f39631c4b0972b19b67c2d"} Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.488621 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.488824 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m299m" Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.490774 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:44.990758024 +0000 UTC m=+209.580421746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.497141 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrgmz" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.503950 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ffb55" podStartSLOduration=136.503933441 podStartE2EDuration="2m16.503933441s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:44.500038498 +0000 UTC m=+209.089702240" watchObservedRunningTime="2026-03-09 02:44:44.503933441 +0000 UTC m=+209.093597173" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.508439 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jgbr"] Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.508763 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" podUID="e1f8e22e-89a6-46b7-94f6-65aa27575c48" containerName="controller-manager" containerID="cri-o://763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de" gracePeriod=30 Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.529806 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-m8h87" podStartSLOduration=136.529789089 podStartE2EDuration="2m16.529789089s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:44.528618492 +0000 UTC m=+209.118282224" watchObservedRunningTime="2026-03-09 02:44:44.529789089 +0000 UTC m=+209.119452821" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.547519 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf"] Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.590642 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.591054 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.091025118 +0000 UTC m=+209.680688850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.591891 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.605395 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n9lvf" podStartSLOduration=136.605377972 podStartE2EDuration="2m16.605377972s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:44.603641327 +0000 UTC m=+209.193305069" watchObservedRunningTime="2026-03-09 02:44:44.605377972 +0000 UTC m=+209.195041704" Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.614936 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.114919814 +0000 UTC m=+209.704583546 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.728986 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.729580 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.229562354 +0000 UTC m=+209.819226086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.750320 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwn6" podStartSLOduration=137.750301861 podStartE2EDuration="2m17.750301861s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:44.707666501 +0000 UTC m=+209.297330233" watchObservedRunningTime="2026-03-09 02:44:44.750301861 +0000 UTC m=+209.339965593" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.831324 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.831771 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.33175545 +0000 UTC m=+209.921419182 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.837842 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" podStartSLOduration=136.837828832 podStartE2EDuration="2m16.837828832s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:44.791544596 +0000 UTC m=+209.381208318" watchObservedRunningTime="2026-03-09 02:44:44.837828832 +0000 UTC m=+209.427492564" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.839586 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-np7wv" podStartSLOduration=8.839580747 podStartE2EDuration="8.839580747s" podCreationTimestamp="2026-03-09 02:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:44.827511925 +0000 UTC m=+209.417175657" watchObservedRunningTime="2026-03-09 02:44:44.839580747 +0000 UTC m=+209.429244479" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.890712 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zf94m" podStartSLOduration=136.890692826 podStartE2EDuration="2m16.890692826s" podCreationTimestamp="2026-03-09 02:42:28 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:44.860530101 +0000 UTC m=+209.450193833" watchObservedRunningTime="2026-03-09 02:44:44.890692826 +0000 UTC m=+209.480356558" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.931753 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9fxwq" Mar 09 02:44:44 crc kubenswrapper[4901]: I0309 02:44:44.932491 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:44 crc kubenswrapper[4901]: E0309 02:44:44.933029 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.433005925 +0000 UTC m=+210.022669657 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.039176 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.039590 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.539576819 +0000 UTC m=+210.129240551 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.043611 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.140272 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-config\") pod \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.140352 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-proxy-ca-bundles\") pod \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.140446 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz995\" (UniqueName: \"kubernetes.io/projected/e1f8e22e-89a6-46b7-94f6-65aa27575c48-kube-api-access-rz995\") pod \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.140595 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.140646 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-client-ca\") pod \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.140680 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f8e22e-89a6-46b7-94f6-65aa27575c48-serving-cert\") pod \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\" (UID: \"e1f8e22e-89a6-46b7-94f6-65aa27575c48\") " Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.140796 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.640771313 +0000 UTC m=+210.230435035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.140871 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.141195 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-client-ca" (OuterVolumeSpecName: "client-ca") pod "e1f8e22e-89a6-46b7-94f6-65aa27575c48" (UID: "e1f8e22e-89a6-46b7-94f6-65aa27575c48"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.141255 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.641243538 +0000 UTC m=+210.230907270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.141335 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-config" (OuterVolumeSpecName: "config") pod "e1f8e22e-89a6-46b7-94f6-65aa27575c48" (UID: "e1f8e22e-89a6-46b7-94f6-65aa27575c48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.145715 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e1f8e22e-89a6-46b7-94f6-65aa27575c48" (UID: "e1f8e22e-89a6-46b7-94f6-65aa27575c48"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.149861 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f8e22e-89a6-46b7-94f6-65aa27575c48-kube-api-access-rz995" (OuterVolumeSpecName: "kube-api-access-rz995") pod "e1f8e22e-89a6-46b7-94f6-65aa27575c48" (UID: "e1f8e22e-89a6-46b7-94f6-65aa27575c48"). InnerVolumeSpecName "kube-api-access-rz995". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.184661 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f8e22e-89a6-46b7-94f6-65aa27575c48-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e1f8e22e-89a6-46b7-94f6-65aa27575c48" (UID: "e1f8e22e-89a6-46b7-94f6-65aa27575c48"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.242036 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.242261 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.742235376 +0000 UTC m=+210.331899108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.242410 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.242682 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f8e22e-89a6-46b7-94f6-65aa27575c48-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.242700 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.242711 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.242721 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz995\" (UniqueName: \"kubernetes.io/projected/e1f8e22e-89a6-46b7-94f6-65aa27575c48-kube-api-access-rz995\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:45 crc 
kubenswrapper[4901]: I0309 02:44:45.242730 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1f8e22e-89a6-46b7-94f6-65aa27575c48-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.242833 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.742819414 +0000 UTC m=+210.332483146 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.312988 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54d6bfbf5c-w8487"] Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.313904 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f8e22e-89a6-46b7-94f6-65aa27575c48" containerName="controller-manager" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.313926 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f8e22e-89a6-46b7-94f6-65aa27575c48" containerName="controller-manager" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.314035 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f8e22e-89a6-46b7-94f6-65aa27575c48" containerName="controller-manager" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.314481 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.324925 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54d6bfbf5c-w8487"] Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.344726 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.345152 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.845136854 +0000 UTC m=+210.434800586 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.371000 4901 patch_prober.go:28] interesting pod/router-default-5444994796-9v4g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 02:44:45 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Mar 09 02:44:45 crc kubenswrapper[4901]: [+]process-running ok Mar 09 02:44:45 crc kubenswrapper[4901]: healthz check failed Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.371071 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9v4g4" podUID="5eb34c1b-916a-4f45-a6b3-e5e8b17a0872" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.446427 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-proxy-ca-bundles\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.446496 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.446560 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-464dl\" (UniqueName: \"kubernetes.io/projected/20e06fd3-8c8f-452c-9df4-25911ea82ac1-kube-api-access-464dl\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.446589 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20e06fd3-8c8f-452c-9df4-25911ea82ac1-serving-cert\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.446609 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-config\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.446639 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-client-ca\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 
09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.446913 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:45.946897315 +0000 UTC m=+210.536561047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.495980 4901 generic.go:334] "Generic (PLEG): container finished" podID="e1f8e22e-89a6-46b7-94f6-65aa27575c48" containerID="763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de" exitCode=0 Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.496202 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" event={"ID":"e1f8e22e-89a6-46b7-94f6-65aa27575c48","Type":"ContainerDied","Data":"763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de"} Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.497395 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" event={"ID":"e1f8e22e-89a6-46b7-94f6-65aa27575c48","Type":"ContainerDied","Data":"906d980e15aacd868d5461777efa76c684fc94240e37f6c98e5dd557d2670978"} Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.497412 4901 scope.go:117] "RemoveContainer" containerID="763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.496295 4901 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5jgbr" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.499332 4901 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kwffm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.499384 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" podUID="ae631b64-6f22-4112-8fb8-aa2c5140275b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.500292 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" podUID="890abe57-aa9b-4c46-8a26-c2c1fd724fab" containerName="route-controller-manager" containerID="cri-o://bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200" gracePeriod=30 Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.548165 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.548496 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-client-ca\") pod \"controller-manager-54d6bfbf5c-w8487\" 
(UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.548717 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-proxy-ca-bundles\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.548899 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-464dl\" (UniqueName: \"kubernetes.io/projected/20e06fd3-8c8f-452c-9df4-25911ea82ac1-kube-api-access-464dl\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.549047 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20e06fd3-8c8f-452c-9df4-25911ea82ac1-serving-cert\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.549068 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-config\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.549205 4901 scope.go:117] "RemoveContainer" containerID="763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de" Mar 09 02:44:45 
crc kubenswrapper[4901]: E0309 02:44:45.550991 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.05096194 +0000 UTC m=+210.640625682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.551056 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de\": container with ID starting with 763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de not found: ID does not exist" containerID="763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.551097 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de"} err="failed to get container status \"763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de\": rpc error: code = NotFound desc = could not find container \"763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de\": container with ID starting with 763e802f5f830b75c23367fc42d4541bdde7131419f766f2f05ea4e153a0e6de not found: ID does not exist" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.552461 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-client-ca\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.554483 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-proxy-ca-bundles\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.554904 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-config\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.558323 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20e06fd3-8c8f-452c-9df4-25911ea82ac1-serving-cert\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.570978 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-464dl\" (UniqueName: \"kubernetes.io/projected/20e06fd3-8c8f-452c-9df4-25911ea82ac1-kube-api-access-464dl\") pod \"controller-manager-54d6bfbf5c-w8487\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: 
I0309 02:44:45.596133 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34476: no serving certificate available for the kubelet" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.625665 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8c7zs"] Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.627088 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.630516 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.638977 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.645618 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8c7zs"] Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.650260 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.650575 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.150562474 +0000 UTC m=+210.740226196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.651681 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jgbr"] Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.653602 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5jgbr"] Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.751828 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.752580 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-catalog-content\") pod \"community-operators-8c7zs\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.752650 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-utilities\") pod \"community-operators-8c7zs\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " 
pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.752722 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzcbv\" (UniqueName: \"kubernetes.io/projected/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-kube-api-access-fzcbv\") pod \"community-operators-8c7zs\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.752909 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.252891594 +0000 UTC m=+210.842555326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.783163 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.834886 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f7kz2"] Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.835198 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3336f36-4389-49db-a669-fe0cbc0bfdfd" containerName="collect-profiles" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.835214 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3336f36-4389-49db-a669-fe0cbc0bfdfd" containerName="collect-profiles" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.835388 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3336f36-4389-49db-a669-fe0cbc0bfdfd" containerName="collect-profiles" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.837097 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.846804 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.850209 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7kz2"] Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.863761 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbzwb\" (UniqueName: \"kubernetes.io/projected/e3336f36-4389-49db-a669-fe0cbc0bfdfd-kube-api-access-qbzwb\") pod \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.863944 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e3336f36-4389-49db-a669-fe0cbc0bfdfd-secret-volume\") pod \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.864056 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3336f36-4389-49db-a669-fe0cbc0bfdfd-config-volume\") pod \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\" (UID: \"e3336f36-4389-49db-a669-fe0cbc0bfdfd\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.864207 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.864259 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-catalog-content\") pod \"community-operators-8c7zs\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.864284 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-utilities\") pod \"community-operators-8c7zs\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.864320 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzcbv\" (UniqueName: 
\"kubernetes.io/projected/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-kube-api-access-fzcbv\") pod \"community-operators-8c7zs\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.867038 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.367021567 +0000 UTC m=+210.956685299 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.867745 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3336f36-4389-49db-a669-fe0cbc0bfdfd-config-volume" (OuterVolumeSpecName: "config-volume") pod "e3336f36-4389-49db-a669-fe0cbc0bfdfd" (UID: "e3336f36-4389-49db-a669-fe0cbc0bfdfd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.868178 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-catalog-content\") pod \"community-operators-8c7zs\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.868407 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-utilities\") pod \"community-operators-8c7zs\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.883536 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3336f36-4389-49db-a669-fe0cbc0bfdfd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e3336f36-4389-49db-a669-fe0cbc0bfdfd" (UID: "e3336f36-4389-49db-a669-fe0cbc0bfdfd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.886385 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3336f36-4389-49db-a669-fe0cbc0bfdfd-kube-api-access-qbzwb" (OuterVolumeSpecName: "kube-api-access-qbzwb") pod "e3336f36-4389-49db-a669-fe0cbc0bfdfd" (UID: "e3336f36-4389-49db-a669-fe0cbc0bfdfd"). InnerVolumeSpecName "kube-api-access-qbzwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.889169 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzcbv\" (UniqueName: \"kubernetes.io/projected/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-kube-api-access-fzcbv\") pod \"community-operators-8c7zs\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.929333 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.942650 4901 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.948629 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.965686 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.965912 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mw89\" (UniqueName: \"kubernetes.io/projected/04b4583f-8f26-47e0-8726-a0c2f2dca07e-kube-api-access-6mw89\") pod \"certified-operators-f7kz2\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.966366 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-utilities\") pod \"certified-operators-f7kz2\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.966587 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-catalog-content\") pod \"certified-operators-f7kz2\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.966644 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3336f36-4389-49db-a669-fe0cbc0bfdfd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 
02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.966657 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbzwb\" (UniqueName: \"kubernetes.io/projected/e3336f36-4389-49db-a669-fe0cbc0bfdfd-kube-api-access-qbzwb\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.966694 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3336f36-4389-49db-a669-fe0cbc0bfdfd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:45 crc kubenswrapper[4901]: E0309 02:44:45.966853 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.466835277 +0000 UTC m=+211.056499009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:45 crc kubenswrapper[4901]: I0309 02:44:45.976902 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54d6bfbf5c-w8487"] Mar 09 02:44:46 crc kubenswrapper[4901]: W0309 02:44:46.001222 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e06fd3_8c8f_452c_9df4_25911ea82ac1.slice/crio-09e748aa655e3e67d6d12350cda748d40cd03621cc2dc4c6fe825f571baa0e63 WatchSource:0}: Error finding container 09e748aa655e3e67d6d12350cda748d40cd03621cc2dc4c6fe825f571baa0e63: Status 404 
returned error can't find the container with id 09e748aa655e3e67d6d12350cda748d40cd03621cc2dc4c6fe825f571baa0e63 Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.032973 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pgv99"] Mar 09 02:44:46 crc kubenswrapper[4901]: E0309 02:44:46.033198 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890abe57-aa9b-4c46-8a26-c2c1fd724fab" containerName="route-controller-manager" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.033212 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="890abe57-aa9b-4c46-8a26-c2c1fd724fab" containerName="route-controller-manager" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.033859 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="890abe57-aa9b-4c46-8a26-c2c1fd724fab" containerName="route-controller-manager" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.034598 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.042613 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgv99"] Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.067925 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/890abe57-aa9b-4c46-8a26-c2c1fd724fab-serving-cert\") pod \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.067992 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwtv4\" (UniqueName: \"kubernetes.io/projected/890abe57-aa9b-4c46-8a26-c2c1fd724fab-kube-api-access-xwtv4\") pod \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.068019 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-config\") pod \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.068334 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-client-ca\") pod \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\" (UID: \"890abe57-aa9b-4c46-8a26-c2c1fd724fab\") " Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.068472 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-utilities\") pod \"certified-operators-f7kz2\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " 
pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.068517 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-catalog-content\") pod \"certified-operators-f7kz2\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.068546 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mw89\" (UniqueName: \"kubernetes.io/projected/04b4583f-8f26-47e0-8726-a0c2f2dca07e-kube-api-access-6mw89\") pod \"certified-operators-f7kz2\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.068577 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:46 crc kubenswrapper[4901]: E0309 02:44:46.068879 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.568867258 +0000 UTC m=+211.158530990 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.069475 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-catalog-content\") pod \"certified-operators-f7kz2\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.069547 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-utilities\") pod \"certified-operators-f7kz2\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.069725 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-client-ca" (OuterVolumeSpecName: "client-ca") pod "890abe57-aa9b-4c46-8a26-c2c1fd724fab" (UID: "890abe57-aa9b-4c46-8a26-c2c1fd724fab"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.069812 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-config" (OuterVolumeSpecName: "config") pod "890abe57-aa9b-4c46-8a26-c2c1fd724fab" (UID: "890abe57-aa9b-4c46-8a26-c2c1fd724fab"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.075975 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890abe57-aa9b-4c46-8a26-c2c1fd724fab-kube-api-access-xwtv4" (OuterVolumeSpecName: "kube-api-access-xwtv4") pod "890abe57-aa9b-4c46-8a26-c2c1fd724fab" (UID: "890abe57-aa9b-4c46-8a26-c2c1fd724fab"). InnerVolumeSpecName "kube-api-access-xwtv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.075979 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890abe57-aa9b-4c46-8a26-c2c1fd724fab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "890abe57-aa9b-4c46-8a26-c2c1fd724fab" (UID: "890abe57-aa9b-4c46-8a26-c2c1fd724fab"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.095207 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mw89\" (UniqueName: \"kubernetes.io/projected/04b4583f-8f26-47e0-8726-a0c2f2dca07e-kube-api-access-6mw89\") pod \"certified-operators-f7kz2\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.113747 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f8e22e-89a6-46b7-94f6-65aa27575c48" path="/var/lib/kubelet/pods/e1f8e22e-89a6-46b7-94f6-65aa27575c48/volumes" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.172809 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.173100 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj7ml\" (UniqueName: \"kubernetes.io/projected/31ec7346-95de-49f9-ad63-7a7423ad1cc3-kube-api-access-mj7ml\") pod \"community-operators-pgv99\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.173132 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-utilities\") pod \"community-operators-pgv99\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.173181 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-catalog-content\") pod \"community-operators-pgv99\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.173290 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.173308 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/890abe57-aa9b-4c46-8a26-c2c1fd724fab-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.173318 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwtv4\" (UniqueName: 
\"kubernetes.io/projected/890abe57-aa9b-4c46-8a26-c2c1fd724fab-kube-api-access-xwtv4\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.173328 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890abe57-aa9b-4c46-8a26-c2c1fd724fab-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:46 crc kubenswrapper[4901]: E0309 02:44:46.173407 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.673391217 +0000 UTC m=+211.263054949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.177592 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.238716 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-llngg"] Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.239838 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.247278 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llngg"] Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.283994 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.284078 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj7ml\" (UniqueName: \"kubernetes.io/projected/31ec7346-95de-49f9-ad63-7a7423ad1cc3-kube-api-access-mj7ml\") pod \"community-operators-pgv99\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.284102 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-utilities\") pod \"community-operators-pgv99\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.284121 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-catalog-content\") pod \"community-operators-pgv99\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.284616 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-catalog-content\") pod \"community-operators-pgv99\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: E0309 02:44:46.284740 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.784723942 +0000 UTC m=+211.374387674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.284766 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-utilities\") pod \"community-operators-pgv99\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.322760 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj7ml\" (UniqueName: \"kubernetes.io/projected/31ec7346-95de-49f9-ad63-7a7423ad1cc3-kube-api-access-mj7ml\") pod \"community-operators-pgv99\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.369733 4901 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.370902 4901 patch_prober.go:28] interesting pod/router-default-5444994796-9v4g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 02:44:46 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Mar 09 02:44:46 crc kubenswrapper[4901]: [+]process-running ok Mar 09 02:44:46 crc kubenswrapper[4901]: healthz check failed Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.370952 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9v4g4" podUID="5eb34c1b-916a-4f45-a6b3-e5e8b17a0872" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.385467 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:46 crc kubenswrapper[4901]: E0309 02:44:46.385596 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.885569475 +0000 UTC m=+211.475233207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.386061 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.386143 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-catalog-content\") pod \"certified-operators-llngg\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.386168 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cctv\" (UniqueName: \"kubernetes.io/projected/f9d5529e-413e-4c88-98d8-1df5a9e55721-kube-api-access-2cctv\") pod \"certified-operators-llngg\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.386191 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-utilities\") pod \"certified-operators-llngg\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: E0309 02:44:46.386505 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.886497024 +0000 UTC m=+211.476160756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tpmfc" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.430544 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8c7zs"] Mar 09 02:44:46 crc kubenswrapper[4901]: W0309 02:44:46.442229 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8f8e088_32d5_4d89_87cc_5a3ecc3f3f20.slice/crio-ad77c7e73f6190c64b8f69b2d69a0e6be726b6532002de8c126c56ec67e77890 WatchSource:0}: Error finding container ad77c7e73f6190c64b8f69b2d69a0e6be726b6532002de8c126c56ec67e77890: Status 404 returned error can't find the container with id ad77c7e73f6190c64b8f69b2d69a0e6be726b6532002de8c126c56ec67e77890 Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.491692 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.492020 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-catalog-content\") pod \"certified-operators-llngg\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.492053 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cctv\" (UniqueName: \"kubernetes.io/projected/f9d5529e-413e-4c88-98d8-1df5a9e55721-kube-api-access-2cctv\") pod \"certified-operators-llngg\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.492084 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-utilities\") pod \"certified-operators-llngg\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.492519 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-utilities\") pod \"certified-operators-llngg\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.492817 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-catalog-content\") pod \"certified-operators-llngg\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: E0309 02:44:46.492826 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 02:44:46.99281268 +0000 UTC m=+211.582476412 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.513833 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cctv\" (UniqueName: \"kubernetes.io/projected/f9d5529e-413e-4c88-98d8-1df5a9e55721-kube-api-access-2cctv\") pod \"certified-operators-llngg\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.514588 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7zs" event={"ID":"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20","Type":"ContainerStarted","Data":"ad77c7e73f6190c64b8f69b2d69a0e6be726b6532002de8c126c56ec67e77890"} Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.516462 4901 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-09T02:44:45.942668662Z","Handler":null,"Name":""} Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.519920 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" event={"ID":"e3336f36-4389-49db-a669-fe0cbc0bfdfd","Type":"ContainerDied","Data":"40da90ad82ded32d62374fd895952217b9b9dcb837fdf64357dec741811ba390"} Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.519960 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40da90ad82ded32d62374fd895952217b9b9dcb837fdf64357dec741811ba390" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.520028 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.526381 4901 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.526412 4901 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.533818 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8"] Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.534465 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.544182 4901 generic.go:334] "Generic (PLEG): container finished" podID="890abe57-aa9b-4c46-8a26-c2c1fd724fab" containerID="bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200" exitCode=0 Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.544291 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" event={"ID":"890abe57-aa9b-4c46-8a26-c2c1fd724fab","Type":"ContainerDied","Data":"bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200"} Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.544326 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" event={"ID":"890abe57-aa9b-4c46-8a26-c2c1fd724fab","Type":"ContainerDied","Data":"dedb703f62caa2b0ed7495796287a437699a55bc93f4f7c90f6ffa90715ec485"} Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.544348 4901 scope.go:117] "RemoveContainer" containerID="bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.544511 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.551921 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" event={"ID":"9e9eea51-12bc-40f5-94b0-3fb75a48b898","Type":"ContainerStarted","Data":"991c6ef56718e2b5524b86739a9a5d7949e370da21b60aefaa2e060e7a1a3c22"} Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.551967 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" event={"ID":"9e9eea51-12bc-40f5-94b0-3fb75a48b898","Type":"ContainerStarted","Data":"36c27803d6ed6994d660e8598fb959509a557e0392833c68f279257a0f898341"} Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.561913 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8"] Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.565737 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.571751 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" event={"ID":"20e06fd3-8c8f-452c-9df4-25911ea82ac1","Type":"ContainerStarted","Data":"0f1ff6dd01bff90753a073c26942f8438816f62dbf721cb67d41d0a23e08dfae"} Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.571806 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" event={"ID":"20e06fd3-8c8f-452c-9df4-25911ea82ac1","Type":"ContainerStarted","Data":"09e748aa655e3e67d6d12350cda748d40cd03621cc2dc4c6fe825f571baa0e63"} Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.573319 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.592974 4901 scope.go:117] "RemoveContainer" containerID="bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.593123 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:44:46 crc kubenswrapper[4901]: E0309 02:44:46.593773 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200\": container with ID starting with bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200 not found: ID does not exist" containerID="bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.593823 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200"} err="failed to get container status \"bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200\": rpc error: code = NotFound desc = could not find container \"bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200\": container with ID starting with bd20d2dfb804bd2a2008883f837fbe0ce04f0947561aa9272165ada954d14200 not found: ID does not exist" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.595685 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f86d992b-5a2d-47ee-acf3-b73a05d43546-serving-cert\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.595724 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-client-ca\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.595755 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28jnw\" (UniqueName: \"kubernetes.io/projected/f86d992b-5a2d-47ee-acf3-b73a05d43546-kube-api-access-28jnw\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.595954 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.596079 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-config\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.609483 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.620960 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.621013 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.643915 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" podStartSLOduration=1.6438925229999999 podStartE2EDuration="1.643892523s" podCreationTimestamp="2026-03-09 02:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:46.620004937 +0000 UTC m=+211.209668689" watchObservedRunningTime="2026-03-09 02:44:46.643892523 +0000 UTC m=+211.233556255" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.651609 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7kz2"] Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.680605 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tpmfc\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.684419 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-pgv99"] Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.697505 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.706097 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f86d992b-5a2d-47ee-acf3-b73a05d43546-serving-cert\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.706530 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-client-ca\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.706643 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28jnw\" (UniqueName: \"kubernetes.io/projected/f86d992b-5a2d-47ee-acf3-b73a05d43546-kube-api-access-28jnw\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.706988 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-config\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.708956 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-config\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.711676 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-client-ca\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.712335 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.714512 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f86d992b-5a2d-47ee-acf3-b73a05d43546-serving-cert\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.725009 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf"] Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.730689 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-57xhf"] Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.732084 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28jnw\" (UniqueName: \"kubernetes.io/projected/f86d992b-5a2d-47ee-acf3-b73a05d43546-kube-api-access-28jnw\") pod \"route-controller-manager-5499684995-bmjk8\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.843060 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.897537 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llngg"] Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.909998 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.960685 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:46 crc kubenswrapper[4901]: I0309 02:44:46.965522 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-92tt5" Mar 09 02:44:46 crc kubenswrapper[4901]: W0309 02:44:46.977737 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9d5529e_413e_4c88_98d8_1df5a9e55721.slice/crio-1414ec18dd610b4e3f03f5b2694707803f86255512f28660430c6d684f0bbf0b WatchSource:0}: Error finding container 1414ec18dd610b4e3f03f5b2694707803f86255512f28660430c6d684f0bbf0b: Status 404 returned error can't find the container with id 1414ec18dd610b4e3f03f5b2694707803f86255512f28660430c6d684f0bbf0b Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.158452 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tpmfc"] Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.370782 4901 patch_prober.go:28] interesting pod/router-default-5444994796-9v4g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 02:44:47 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Mar 09 02:44:47 crc kubenswrapper[4901]: [+]process-running ok Mar 09 02:44:47 crc kubenswrapper[4901]: healthz check failed Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.371192 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9v4g4" podUID="5eb34c1b-916a-4f45-a6b3-e5e8b17a0872" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.376616 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.377391 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.383871 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.384361 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 02:44:47 crc kubenswrapper[4901]: E0309 02:44:47.403422 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9d5529e_413e_4c88_98d8_1df5a9e55721.slice/crio-368e95487a8e31ef31a5fc871c10b02ebed4b56116c83e7486b7b9a266d1bfff.scope\": RecentStats: unable to find data in memory cache]" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.433708 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.494643 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8"] Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.529076 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96f3139b-08f7-4019-b629-c0aae2286507-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"96f3139b-08f7-4019-b629-c0aae2286507\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 02:44:47 crc 
kubenswrapper[4901]: I0309 02:44:47.529117 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96f3139b-08f7-4019-b629-c0aae2286507-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"96f3139b-08f7-4019-b629-c0aae2286507\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.584742 4901 generic.go:334] "Generic (PLEG): container finished" podID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerID="368e95487a8e31ef31a5fc871c10b02ebed4b56116c83e7486b7b9a266d1bfff" exitCode=0 Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.584817 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llngg" event={"ID":"f9d5529e-413e-4c88-98d8-1df5a9e55721","Type":"ContainerDied","Data":"368e95487a8e31ef31a5fc871c10b02ebed4b56116c83e7486b7b9a266d1bfff"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.584921 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llngg" event={"ID":"f9d5529e-413e-4c88-98d8-1df5a9e55721","Type":"ContainerStarted","Data":"1414ec18dd610b4e3f03f5b2694707803f86255512f28660430c6d684f0bbf0b"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.590451 4901 generic.go:334] "Generic (PLEG): container finished" podID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerID="8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9" exitCode=0 Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.590610 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgv99" event={"ID":"31ec7346-95de-49f9-ad63-7a7423ad1cc3","Type":"ContainerDied","Data":"8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.590671 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-pgv99" event={"ID":"31ec7346-95de-49f9-ad63-7a7423ad1cc3","Type":"ContainerStarted","Data":"3e9c01d36d5ee1ced6522878dfb79b08d16bed6c0a4e805bdd30c0f9dc1f4148"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.630823 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96f3139b-08f7-4019-b629-c0aae2286507-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"96f3139b-08f7-4019-b629-c0aae2286507\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.630880 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96f3139b-08f7-4019-b629-c0aae2286507-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"96f3139b-08f7-4019-b629-c0aae2286507\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.631303 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96f3139b-08f7-4019-b629-c0aae2286507-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"96f3139b-08f7-4019-b629-c0aae2286507\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.640813 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" event={"ID":"9e9eea51-12bc-40f5-94b0-3fb75a48b898","Type":"ContainerStarted","Data":"c48e1919908937c2177e58e36c65286751bf798175fd2058e923c98dc61988aa"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.646316 4901 generic.go:334] "Generic (PLEG): container finished" podID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerID="2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66" exitCode=0 Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.646400 
4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7kz2" event={"ID":"04b4583f-8f26-47e0-8726-a0c2f2dca07e","Type":"ContainerDied","Data":"2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.646471 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7kz2" event={"ID":"04b4583f-8f26-47e0-8726-a0c2f2dca07e","Type":"ContainerStarted","Data":"fa840e7517bc7d02b5da717dfbd5f74776795a5c63c17597b6707a0d89c11c3e"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.652829 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96f3139b-08f7-4019-b629-c0aae2286507-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"96f3139b-08f7-4019-b629-c0aae2286507\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.652858 4901 generic.go:334] "Generic (PLEG): container finished" podID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerID="871f8665b64e519e97af09130385cb75718dc35dd882a5dbc5ec662ba988581e" exitCode=0 Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.653020 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7zs" event={"ID":"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20","Type":"ContainerDied","Data":"871f8665b64e519e97af09130385cb75718dc35dd882a5dbc5ec662ba988581e"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.659186 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" event={"ID":"f86d992b-5a2d-47ee-acf3-b73a05d43546","Type":"ContainerStarted","Data":"2f9bd55901a29d4456ef5545596d85aa9e3f2c1979e133a61b8a9f30f3b5284f"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.662624 4901 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" event={"ID":"3befa408-f48c-4244-81ca-6bf178967fbe","Type":"ContainerStarted","Data":"4accb1a5953199fd48c9457d5a8e4d43ff0a6f94b349e73f169595c37979ee89"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.662732 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" event={"ID":"3befa408-f48c-4244-81ca-6bf178967fbe","Type":"ContainerStarted","Data":"27911bfed1ce10bbb49a63a16d2402f2ee0007b275a29004aa1f7f43403610e7"} Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.664003 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.668169 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-lgxp2" podStartSLOduration=11.668142442 podStartE2EDuration="11.668142442s" podCreationTimestamp="2026-03-09 02:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:47.666796929 +0000 UTC m=+212.256460661" watchObservedRunningTime="2026-03-09 02:44:47.668142442 +0000 UTC m=+212.257806184" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.702806 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.711495 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" podStartSLOduration=140.711470384 podStartE2EDuration="2m20.711470384s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:47.708371716 +0000 UTC m=+212.298035458" watchObservedRunningTime="2026-03-09 02:44:47.711470384 +0000 UTC m=+212.301134106" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.828584 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lvtrc"] Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.829594 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.831730 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.852552 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvtrc"] Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.937934 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-utilities\") pod \"redhat-marketplace-lvtrc\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.938284 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-catalog-content\") pod \"redhat-marketplace-lvtrc\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.938328 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q425q\" (UniqueName: \"kubernetes.io/projected/1092e265-a1ed-40f3-9a91-c1996ea7479c-kube-api-access-q425q\") pod \"redhat-marketplace-lvtrc\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:47 crc kubenswrapper[4901]: I0309 02:44:47.965912 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.041163 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-utilities\") pod \"redhat-marketplace-lvtrc\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.041302 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-catalog-content\") pod \"redhat-marketplace-lvtrc\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.041349 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q425q\" (UniqueName: \"kubernetes.io/projected/1092e265-a1ed-40f3-9a91-c1996ea7479c-kube-api-access-q425q\") pod \"redhat-marketplace-lvtrc\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " 
pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.042150 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-utilities\") pod \"redhat-marketplace-lvtrc\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.043860 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-catalog-content\") pod \"redhat-marketplace-lvtrc\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.062221 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q425q\" (UniqueName: \"kubernetes.io/projected/1092e265-a1ed-40f3-9a91-c1996ea7479c-kube-api-access-q425q\") pod \"redhat-marketplace-lvtrc\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.123468 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890abe57-aa9b-4c46-8a26-c2c1fd724fab" path="/var/lib/kubelet/pods/890abe57-aa9b-4c46-8a26-c2c1fd724fab/volumes" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.124363 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.180649 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.199853 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34490: no serving certificate available for the kubelet" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.225972 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6kj"] Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.227930 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.236408 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6kj"] Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.346151 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-catalog-content\") pod \"redhat-marketplace-jp6kj\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.346204 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98gpc\" (UniqueName: \"kubernetes.io/projected/da9622a3-9e9b-4c8a-86d3-110f44883bbc-kube-api-access-98gpc\") pod \"redhat-marketplace-jp6kj\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.346273 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-utilities\") pod \"redhat-marketplace-jp6kj\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " 
pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.381708 4901 patch_prober.go:28] interesting pod/router-default-5444994796-9v4g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 02:44:48 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Mar 09 02:44:48 crc kubenswrapper[4901]: [+]process-running ok Mar 09 02:44:48 crc kubenswrapper[4901]: healthz check failed Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.382059 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9v4g4" podUID="5eb34c1b-916a-4f45-a6b3-e5e8b17a0872" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.447571 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-catalog-content\") pod \"redhat-marketplace-jp6kj\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.447640 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98gpc\" (UniqueName: \"kubernetes.io/projected/da9622a3-9e9b-4c8a-86d3-110f44883bbc-kube-api-access-98gpc\") pod \"redhat-marketplace-jp6kj\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.447684 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-utilities\") pod \"redhat-marketplace-jp6kj\" (UID: 
\"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.448163 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvtrc"] Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.449044 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-utilities\") pod \"redhat-marketplace-jp6kj\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.449123 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-catalog-content\") pod \"redhat-marketplace-jp6kj\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.466686 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98gpc\" (UniqueName: \"kubernetes.io/projected/da9622a3-9e9b-4c8a-86d3-110f44883bbc-kube-api-access-98gpc\") pod \"redhat-marketplace-jp6kj\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: W0309 02:44:48.468873 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1092e265_a1ed_40f3_9a91_c1996ea7479c.slice/crio-4c82cd34d56ab02d1aa1166ad65a93b978d753395d57bf0a3096e373f891dc9a WatchSource:0}: Error finding container 4c82cd34d56ab02d1aa1166ad65a93b978d753395d57bf0a3096e373f891dc9a: Status 404 returned error can't find the container with id 4c82cd34d56ab02d1aa1166ad65a93b978d753395d57bf0a3096e373f891dc9a Mar 09 02:44:48 crc 
kubenswrapper[4901]: I0309 02:44:48.577679 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.671194 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"96f3139b-08f7-4019-b629-c0aae2286507","Type":"ContainerStarted","Data":"6be170aeaffe427415a4ce77fb8a665854aec5deafef0adf40d2c9b4c856c24b"} Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.671242 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"96f3139b-08f7-4019-b629-c0aae2286507","Type":"ContainerStarted","Data":"9564bfb5831d51d5ca8b7f665bc3676172b4c097a093e358aa327e2b56cb3bd5"} Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.678785 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" event={"ID":"f86d992b-5a2d-47ee-acf3-b73a05d43546","Type":"ContainerStarted","Data":"4d26710a3af4946b502fa37af205edd2be2a09071299c2d8432854be8491f006"} Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.680024 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.684003 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvtrc" event={"ID":"1092e265-a1ed-40f3-9a91-c1996ea7479c","Type":"ContainerStarted","Data":"4c82cd34d56ab02d1aa1166ad65a93b978d753395d57bf0a3096e373f891dc9a"} Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.688648 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.689299 4901 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.689290341 podStartE2EDuration="1.689290341s" podCreationTimestamp="2026-03-09 02:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:48.688722493 +0000 UTC m=+213.278386225" watchObservedRunningTime="2026-03-09 02:44:48.689290341 +0000 UTC m=+213.278954073" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.700677 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.701697 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.704169 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.704259 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.714332 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.715484 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" podStartSLOduration=3.7154654799999998 podStartE2EDuration="3.71546548s" podCreationTimestamp="2026-03-09 02:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:48.710313847 +0000 UTC m=+213.299977579" 
watchObservedRunningTime="2026-03-09 02:44:48.71546548 +0000 UTC m=+213.305129202" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.718698 4901 patch_prober.go:28] interesting pod/downloads-7954f5f757-jznjb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.718739 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jznjb" podUID="054fc716-7800-43b0-af23-328b685f89f9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.720239 4901 patch_prober.go:28] interesting pod/downloads-7954f5f757-jznjb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.720340 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jznjb" podUID="054fc716-7800-43b0-af23-328b685f89f9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.756953 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.757412 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.830682 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m9pkg"] Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.832702 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.840724 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.853435 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9pkg"] Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.859470 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.859536 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.859586 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.877042 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.917031 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6kj"] Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.961661 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-utilities\") pod \"redhat-operators-m9pkg\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.961756 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-catalog-content\") pod \"redhat-operators-m9pkg\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:48 crc kubenswrapper[4901]: I0309 02:44:48.961798 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvb89\" (UniqueName: \"kubernetes.io/projected/1f881266-e72e-4a74-a232-2dc3c6e95f08-kube-api-access-hvb89\") pod \"redhat-operators-m9pkg\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " 
pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.024442 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.062651 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-catalog-content\") pod \"redhat-operators-m9pkg\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.062711 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvb89\" (UniqueName: \"kubernetes.io/projected/1f881266-e72e-4a74-a232-2dc3c6e95f08-kube-api-access-hvb89\") pod \"redhat-operators-m9pkg\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.062766 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-utilities\") pod \"redhat-operators-m9pkg\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.063137 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-utilities\") pod \"redhat-operators-m9pkg\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.063365 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-catalog-content\") pod \"redhat-operators-m9pkg\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.085041 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvb89\" (UniqueName: \"kubernetes.io/projected/1f881266-e72e-4a74-a232-2dc3c6e95f08-kube-api-access-hvb89\") pod \"redhat-operators-m9pkg\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.156641 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.237096 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rrzzj"] Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.238827 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.256934 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rrzzj"] Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.258889 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.258920 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.275017 4901 patch_prober.go:28] interesting pod/console-f9d7485db-xc8gr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.40:8443/health\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.275070 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-xc8gr" podUID="65646690-9b87-47d8-a187-207924a2c486" containerName="console" probeResult="failure" output="Get \"https://10.217.0.40:8443/health\": dial tcp 10.217.0.40:8443: connect: connection refused" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.367679 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.371124 4901 patch_prober.go:28] interesting pod/router-default-5444994796-9v4g4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 02:44:49 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Mar 09 02:44:49 crc kubenswrapper[4901]: [+]process-running ok Mar 09 02:44:49 crc kubenswrapper[4901]: healthz check 
failed Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.371197 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9v4g4" podUID="5eb34c1b-916a-4f45-a6b3-e5e8b17a0872" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.376160 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ckrx\" (UniqueName: \"kubernetes.io/projected/980a688d-18b9-4f90-9255-a55568e7bbc0-kube-api-access-4ckrx\") pod \"redhat-operators-rrzzj\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.376187 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-utilities\") pod \"redhat-operators-rrzzj\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.376327 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-catalog-content\") pod \"redhat-operators-rrzzj\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.482827 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-utilities\") pod \"redhat-operators-rrzzj\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.482918 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-catalog-content\") pod \"redhat-operators-rrzzj\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.483014 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ckrx\" (UniqueName: \"kubernetes.io/projected/980a688d-18b9-4f90-9255-a55568e7bbc0-kube-api-access-4ckrx\") pod \"redhat-operators-rrzzj\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.485364 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-utilities\") pod \"redhat-operators-rrzzj\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.486140 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-catalog-content\") pod \"redhat-operators-rrzzj\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.511966 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ckrx\" (UniqueName: \"kubernetes.io/projected/980a688d-18b9-4f90-9255-a55568e7bbc0-kube-api-access-4ckrx\") pod \"redhat-operators-rrzzj\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.594371 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.624932 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 02:44:49 crc kubenswrapper[4901]: W0309 02:44:49.675301 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod57cf013d_2206_4a42_a7f9_78b7a8fa4c5d.slice/crio-5e160cedbe5e9281a0747197729ce38d6f5dc3e49716e1e155bffe07f75e968d WatchSource:0}: Error finding container 5e160cedbe5e9281a0747197729ce38d6f5dc3e49716e1e155bffe07f75e968d: Status 404 returned error can't find the container with id 5e160cedbe5e9281a0747197729ce38d6f5dc3e49716e1e155bffe07f75e968d Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.699302 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9pkg"] Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.728187 4901 generic.go:334] "Generic (PLEG): container finished" podID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerID="53230f46debba4713ebe4d6ad9b7aefb27b7fd4a4b57aebf05721b6999490970" exitCode=0 Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.728275 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvtrc" event={"ID":"1092e265-a1ed-40f3-9a91-c1996ea7479c","Type":"ContainerDied","Data":"53230f46debba4713ebe4d6ad9b7aefb27b7fd4a4b57aebf05721b6999490970"} Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.734589 4901 generic.go:334] "Generic (PLEG): container finished" podID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerID="bc517bbf89051779d2c9961b2dbb1cb2009e6191eb89d6f198ecf7896781c952" exitCode=0 Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.734669 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6kj" 
event={"ID":"da9622a3-9e9b-4c8a-86d3-110f44883bbc","Type":"ContainerDied","Data":"bc517bbf89051779d2c9961b2dbb1cb2009e6191eb89d6f198ecf7896781c952"} Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.734695 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6kj" event={"ID":"da9622a3-9e9b-4c8a-86d3-110f44883bbc","Type":"ContainerStarted","Data":"c0dbd2e364a9d9f9191768b79e02f943d3dfb0ed0e35922b763df5adbba2e901"} Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.740528 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d","Type":"ContainerStarted","Data":"5e160cedbe5e9281a0747197729ce38d6f5dc3e49716e1e155bffe07f75e968d"} Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.749533 4901 generic.go:334] "Generic (PLEG): container finished" podID="96f3139b-08f7-4019-b629-c0aae2286507" containerID="6be170aeaffe427415a4ce77fb8a665854aec5deafef0adf40d2c9b4c856c24b" exitCode=0 Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.749753 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"96f3139b-08f7-4019-b629-c0aae2286507","Type":"ContainerDied","Data":"6be170aeaffe427415a4ce77fb8a665854aec5deafef0adf40d2c9b4c856c24b"} Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.803079 4901 ???:1] "http: TLS handshake error from 192.168.126.11:34496: no serving certificate available for the kubelet" Mar 09 02:44:49 crc kubenswrapper[4901]: I0309 02:44:49.968394 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rrzzj"] Mar 09 02:44:50 crc kubenswrapper[4901]: I0309 02:44:50.372041 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:50 crc kubenswrapper[4901]: I0309 02:44:50.380075 4901 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9v4g4" Mar 09 02:44:50 crc kubenswrapper[4901]: I0309 02:44:50.763908 4901 generic.go:334] "Generic (PLEG): container finished" podID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerID="221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9" exitCode=0 Mar 09 02:44:50 crc kubenswrapper[4901]: I0309 02:44:50.763983 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrzzj" event={"ID":"980a688d-18b9-4f90-9255-a55568e7bbc0","Type":"ContainerDied","Data":"221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9"} Mar 09 02:44:50 crc kubenswrapper[4901]: I0309 02:44:50.764011 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrzzj" event={"ID":"980a688d-18b9-4f90-9255-a55568e7bbc0","Type":"ContainerStarted","Data":"eebca66ba4df17581a2b391b60c15d20e345580a7efefc3b3b72adca8c654761"} Mar 09 02:44:50 crc kubenswrapper[4901]: I0309 02:44:50.768470 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d","Type":"ContainerStarted","Data":"7f1e6c3258ae9bce71cb0a280c27fad75d40e203f4b399404360c3c0ed1e05a7"} Mar 09 02:44:50 crc kubenswrapper[4901]: I0309 02:44:50.770671 4901 generic.go:334] "Generic (PLEG): container finished" podID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerID="1dbb22790e59c59af731447241266c244f449a0deeb5bf08a313ec3f2c0c8430" exitCode=0 Mar 09 02:44:50 crc kubenswrapper[4901]: I0309 02:44:50.770716 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9pkg" event={"ID":"1f881266-e72e-4a74-a232-2dc3c6e95f08","Type":"ContainerDied","Data":"1dbb22790e59c59af731447241266c244f449a0deeb5bf08a313ec3f2c0c8430"} Mar 09 02:44:50 crc kubenswrapper[4901]: I0309 02:44:50.770734 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9pkg" event={"ID":"1f881266-e72e-4a74-a232-2dc3c6e95f08","Type":"ContainerStarted","Data":"26f0c78f7c31f9c59be5d85ae92944375c6d9934b1dc573c9efa445e871ddb5c"} Mar 09 02:44:50 crc kubenswrapper[4901]: I0309 02:44:50.801157 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.801136824 podStartE2EDuration="2.801136824s" podCreationTimestamp="2026-03-09 02:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:44:50.79814459 +0000 UTC m=+215.387808322" watchObservedRunningTime="2026-03-09 02:44:50.801136824 +0000 UTC m=+215.390800556" Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.115697 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.227619 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96f3139b-08f7-4019-b629-c0aae2286507-kube-api-access\") pod \"96f3139b-08f7-4019-b629-c0aae2286507\" (UID: \"96f3139b-08f7-4019-b629-c0aae2286507\") " Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.227715 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96f3139b-08f7-4019-b629-c0aae2286507-kubelet-dir\") pod \"96f3139b-08f7-4019-b629-c0aae2286507\" (UID: \"96f3139b-08f7-4019-b629-c0aae2286507\") " Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.227879 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96f3139b-08f7-4019-b629-c0aae2286507-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"96f3139b-08f7-4019-b629-c0aae2286507" (UID: "96f3139b-08f7-4019-b629-c0aae2286507"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.233447 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f3139b-08f7-4019-b629-c0aae2286507-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "96f3139b-08f7-4019-b629-c0aae2286507" (UID: "96f3139b-08f7-4019-b629-c0aae2286507"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.329490 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96f3139b-08f7-4019-b629-c0aae2286507-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.329546 4901 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96f3139b-08f7-4019-b629-c0aae2286507-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.791955 4901 generic.go:334] "Generic (PLEG): container finished" podID="57cf013d-2206-4a42-a7f9-78b7a8fa4c5d" containerID="7f1e6c3258ae9bce71cb0a280c27fad75d40e203f4b399404360c3c0ed1e05a7" exitCode=0 Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.792042 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d","Type":"ContainerDied","Data":"7f1e6c3258ae9bce71cb0a280c27fad75d40e203f4b399404360c3c0ed1e05a7"} Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.794374 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.794344 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"96f3139b-08f7-4019-b629-c0aae2286507","Type":"ContainerDied","Data":"9564bfb5831d51d5ca8b7f665bc3676172b4c097a093e358aa327e2b56cb3bd5"} Mar 09 02:44:51 crc kubenswrapper[4901]: I0309 02:44:51.795050 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9564bfb5831d51d5ca8b7f665bc3676172b4c097a093e358aa327e2b56cb3bd5" Mar 09 02:44:53 crc kubenswrapper[4901]: I0309 02:44:53.337428 4901 ???:1] "http: TLS handshake error from 192.168.126.11:33742: no serving certificate available for the kubelet" Mar 09 02:44:53 crc kubenswrapper[4901]: I0309 02:44:53.821651 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-jffft_99c51354-0ea1-4e7a-bd34-1bb57e6422c0/cluster-samples-operator/0.log" Mar 09 02:44:53 crc kubenswrapper[4901]: I0309 02:44:53.821687 4901 generic.go:334] "Generic (PLEG): container finished" podID="99c51354-0ea1-4e7a-bd34-1bb57e6422c0" containerID="b1a3e2e4429ef200d4159db216055d96b7cfb870d1918713f26d536f822cf888" exitCode=2 Mar 09 02:44:53 crc kubenswrapper[4901]: I0309 02:44:53.821716 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" event={"ID":"99c51354-0ea1-4e7a-bd34-1bb57e6422c0","Type":"ContainerDied","Data":"b1a3e2e4429ef200d4159db216055d96b7cfb870d1918713f26d536f822cf888"} Mar 09 02:44:53 crc kubenswrapper[4901]: I0309 02:44:53.822130 4901 scope.go:117] "RemoveContainer" containerID="b1a3e2e4429ef200d4159db216055d96b7cfb870d1918713f26d536f822cf888" Mar 09 02:44:54 crc kubenswrapper[4901]: I0309 02:44:54.358023 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-dns/dns-default-np7wv" Mar 09 02:44:58 crc kubenswrapper[4901]: I0309 02:44:58.724770 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jznjb" Mar 09 02:44:59 crc kubenswrapper[4901]: I0309 02:44:59.262792 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:44:59 crc kubenswrapper[4901]: I0309 02:44:59.268244 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-xc8gr" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.135383 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5"] Mar 09 02:45:00 crc kubenswrapper[4901]: E0309 02:45:00.135593 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f3139b-08f7-4019-b629-c0aae2286507" containerName="pruner" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.135604 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f3139b-08f7-4019-b629-c0aae2286507" containerName="pruner" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.135691 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f3139b-08f7-4019-b629-c0aae2286507" containerName="pruner" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.136054 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.137827 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.139447 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.153649 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5"] Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.200661 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7e0edae-f21a-473e-93a7-c9b797f9a112-secret-volume\") pod \"collect-profiles-29550405-nlxv5\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.200742 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ppsc\" (UniqueName: \"kubernetes.io/projected/e7e0edae-f21a-473e-93a7-c9b797f9a112-kube-api-access-6ppsc\") pod \"collect-profiles-29550405-nlxv5\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.200820 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e0edae-f21a-473e-93a7-c9b797f9a112-config-volume\") pod \"collect-profiles-29550405-nlxv5\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.302075 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e0edae-f21a-473e-93a7-c9b797f9a112-config-volume\") pod \"collect-profiles-29550405-nlxv5\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.302138 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7e0edae-f21a-473e-93a7-c9b797f9a112-secret-volume\") pod \"collect-profiles-29550405-nlxv5\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.302173 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ppsc\" (UniqueName: \"kubernetes.io/projected/e7e0edae-f21a-473e-93a7-c9b797f9a112-kube-api-access-6ppsc\") pod \"collect-profiles-29550405-nlxv5\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.303009 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e0edae-f21a-473e-93a7-c9b797f9a112-config-volume\") pod \"collect-profiles-29550405-nlxv5\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.312508 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e7e0edae-f21a-473e-93a7-c9b797f9a112-secret-volume\") pod \"collect-profiles-29550405-nlxv5\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.323113 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ppsc\" (UniqueName: \"kubernetes.io/projected/e7e0edae-f21a-473e-93a7-c9b797f9a112-kube-api-access-6ppsc\") pod \"collect-profiles-29550405-nlxv5\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:00 crc kubenswrapper[4901]: I0309 02:45:00.465434 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:01 crc kubenswrapper[4901]: I0309 02:45:01.306361 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 02:45:01 crc kubenswrapper[4901]: I0309 02:45:01.416681 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kubelet-dir\") pod \"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d\" (UID: \"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d\") " Mar 09 02:45:01 crc kubenswrapper[4901]: I0309 02:45:01.416749 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kube-api-access\") pod \"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d\" (UID: \"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d\") " Mar 09 02:45:01 crc kubenswrapper[4901]: I0309 02:45:01.416897 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "57cf013d-2206-4a42-a7f9-78b7a8fa4c5d" (UID: "57cf013d-2206-4a42-a7f9-78b7a8fa4c5d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:45:01 crc kubenswrapper[4901]: I0309 02:45:01.417131 4901 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:01 crc kubenswrapper[4901]: I0309 02:45:01.420156 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "57cf013d-2206-4a42-a7f9-78b7a8fa4c5d" (UID: "57cf013d-2206-4a42-a7f9-78b7a8fa4c5d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:01 crc kubenswrapper[4901]: I0309 02:45:01.517818 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57cf013d-2206-4a42-a7f9-78b7a8fa4c5d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:01 crc kubenswrapper[4901]: I0309 02:45:01.867325 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"57cf013d-2206-4a42-a7f9-78b7a8fa4c5d","Type":"ContainerDied","Data":"5e160cedbe5e9281a0747197729ce38d6f5dc3e49716e1e155bffe07f75e968d"} Mar 09 02:45:01 crc kubenswrapper[4901]: I0309 02:45:01.867363 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e160cedbe5e9281a0747197729ce38d6f5dc3e49716e1e155bffe07f75e968d" Mar 09 02:45:01 crc kubenswrapper[4901]: I0309 02:45:01.867421 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 02:45:04 crc kubenswrapper[4901]: I0309 02:45:04.210376 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54d6bfbf5c-w8487"] Mar 09 02:45:04 crc kubenswrapper[4901]: I0309 02:45:04.210928 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" podUID="20e06fd3-8c8f-452c-9df4-25911ea82ac1" containerName="controller-manager" containerID="cri-o://0f1ff6dd01bff90753a073c26942f8438816f62dbf721cb67d41d0a23e08dfae" gracePeriod=30 Mar 09 02:45:04 crc kubenswrapper[4901]: I0309 02:45:04.213055 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8"] Mar 09 02:45:04 crc kubenswrapper[4901]: I0309 02:45:04.213332 4901 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" podUID="f86d992b-5a2d-47ee-acf3-b73a05d43546" containerName="route-controller-manager" containerID="cri-o://4d26710a3af4946b502fa37af205edd2be2a09071299c2d8432854be8491f006" gracePeriod=30 Mar 09 02:45:04 crc kubenswrapper[4901]: I0309 02:45:04.890870 4901 generic.go:334] "Generic (PLEG): container finished" podID="20e06fd3-8c8f-452c-9df4-25911ea82ac1" containerID="0f1ff6dd01bff90753a073c26942f8438816f62dbf721cb67d41d0a23e08dfae" exitCode=0 Mar 09 02:45:04 crc kubenswrapper[4901]: I0309 02:45:04.891055 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" event={"ID":"20e06fd3-8c8f-452c-9df4-25911ea82ac1","Type":"ContainerDied","Data":"0f1ff6dd01bff90753a073c26942f8438816f62dbf721cb67d41d0a23e08dfae"} Mar 09 02:45:04 crc kubenswrapper[4901]: I0309 02:45:04.893268 4901 generic.go:334] "Generic (PLEG): container finished" podID="f86d992b-5a2d-47ee-acf3-b73a05d43546" containerID="4d26710a3af4946b502fa37af205edd2be2a09071299c2d8432854be8491f006" exitCode=0 Mar 09 02:45:04 crc kubenswrapper[4901]: I0309 02:45:04.893349 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" event={"ID":"f86d992b-5a2d-47ee-acf3-b73a05d43546","Type":"ContainerDied","Data":"4d26710a3af4946b502fa37af205edd2be2a09071299c2d8432854be8491f006"} Mar 09 02:45:05 crc kubenswrapper[4901]: I0309 02:45:05.640707 4901 patch_prober.go:28] interesting pod/controller-manager-54d6bfbf5c-w8487 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 09 02:45:05 crc kubenswrapper[4901]: I0309 02:45:05.641123 4901 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" podUID="20e06fd3-8c8f-452c-9df4-25911ea82ac1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 09 02:45:06 crc kubenswrapper[4901]: E0309 02:45:06.175506 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 09 02:45:06 crc kubenswrapper[4901]: E0309 02:45:06.176121 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 02:45:06 crc kubenswrapper[4901]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 09 02:45:06 crc kubenswrapper[4901]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dkn9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29550404-w8858_openshift-infra(680e9e87-71a2-402c-84f2-e8eb2b7a4c44): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 09 02:45:06 crc 
kubenswrapper[4901]: > logger="UnhandledError" Mar 09 02:45:06 crc kubenswrapper[4901]: E0309 02:45:06.177391 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29550404-w8858" podUID="680e9e87-71a2-402c-84f2-e8eb2b7a4c44" Mar 09 02:45:06 crc kubenswrapper[4901]: I0309 02:45:06.847362 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:45:06 crc kubenswrapper[4901]: E0309 02:45:06.916611 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29550404-w8858" podUID="680e9e87-71a2-402c-84f2-e8eb2b7a4c44" Mar 09 02:45:07 crc kubenswrapper[4901]: I0309 02:45:07.914267 4901 patch_prober.go:28] interesting pod/route-controller-manager-5499684995-bmjk8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 02:45:07 crc kubenswrapper[4901]: I0309 02:45:07.914375 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" podUID="f86d992b-5a2d-47ee-acf3-b73a05d43546" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 02:45:10 crc kubenswrapper[4901]: E0309 02:45:10.690917 4901 log.go:32] "PullImage 
from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 02:45:10 crc kubenswrapper[4901]: E0309 02:45:10.691882 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2cctv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-llngg_openshift-marketplace(f9d5529e-413e-4c88-98d8-1df5a9e55721): ErrImagePull: rpc 
error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 02:45:10 crc kubenswrapper[4901]: E0309 02:45:10.693167 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-llngg" podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" Mar 09 02:45:11 crc kubenswrapper[4901]: E0309 02:45:11.842450 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-llngg" podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" Mar 09 02:45:11 crc kubenswrapper[4901]: E0309 02:45:11.901693 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 09 02:45:11 crc kubenswrapper[4901]: E0309 02:45:11.901847 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98gpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jp6kj_openshift-marketplace(da9622a3-9e9b-4c8a-86d3-110f44883bbc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 02:45:11 crc kubenswrapper[4901]: E0309 02:45:11.903230 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jp6kj" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" Mar 09 02:45:13 crc 
kubenswrapper[4901]: E0309 02:45:13.149866 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jp6kj" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" Mar 09 02:45:13 crc kubenswrapper[4901]: E0309 02:45:13.229551 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 02:45:13 crc kubenswrapper[4901]: E0309 02:45:13.232072 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mj7ml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pgv99_openshift-marketplace(31ec7346-95de-49f9-ad63-7a7423ad1cc3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 02:45:13 crc kubenswrapper[4901]: E0309 02:45:13.233147 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pgv99" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" Mar 09 02:45:13 crc 
kubenswrapper[4901]: E0309 02:45:13.238932 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 09 02:45:13 crc kubenswrapper[4901]: E0309 02:45:13.239051 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q425q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-lvtrc_openshift-marketplace(1092e265-a1ed-40f3-9a91-c1996ea7479c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 02:45:13 crc kubenswrapper[4901]: E0309 02:45:13.240824 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lvtrc" podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" Mar 09 02:45:13 crc kubenswrapper[4901]: I0309 02:45:13.836071 4901 ???:1] "http: TLS handshake error from 192.168.126.11:58478: no serving certificate available for the kubelet" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.516341 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pgv99" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.516878 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lvtrc" podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.552055 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.552285 4901 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ckrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rrzzj_openshift-marketplace(980a688d-18b9-4f90-9255-a55568e7bbc0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.553482 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" 
with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rrzzj" podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.596069 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.596346 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hvb89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY
:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-m9pkg_openshift-marketplace(1f881266-e72e-4a74-a232-2dc3c6e95f08): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.598262 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-m9pkg" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.602801 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.617830 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.639133 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.639318 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fzcbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},Star
tupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8c7zs_openshift-marketplace(b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.640564 4901 patch_prober.go:28] interesting pod/controller-manager-54d6bfbf5c-w8487 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.640642 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" podUID="20e06fd3-8c8f-452c-9df4-25911ea82ac1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.640827 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8c7zs" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.652446 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76c4d7d756-q8rgc"] Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.652914 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e06fd3-8c8f-452c-9df4-25911ea82ac1" containerName="controller-manager" Mar 09 
02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.652932 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e06fd3-8c8f-452c-9df4-25911ea82ac1" containerName="controller-manager" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.652954 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57cf013d-2206-4a42-a7f9-78b7a8fa4c5d" containerName="pruner" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.652993 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="57cf013d-2206-4a42-a7f9-78b7a8fa4c5d" containerName="pruner" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.653015 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86d992b-5a2d-47ee-acf3-b73a05d43546" containerName="route-controller-manager" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.653024 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86d992b-5a2d-47ee-acf3-b73a05d43546" containerName="route-controller-manager" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.653318 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86d992b-5a2d-47ee-acf3-b73a05d43546" containerName="route-controller-manager" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.653335 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e06fd3-8c8f-452c-9df4-25911ea82ac1" containerName="controller-manager" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.653352 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="57cf013d-2206-4a42-a7f9-78b7a8fa4c5d" containerName="pruner" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.653989 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.659811 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76c4d7d756-q8rgc"] Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.680676 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.680829 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6mw89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompPr
ofile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-f7kz2_openshift-marketplace(04b4583f-8f26-47e0-8726-a0c2f2dca07e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 02:45:16 crc kubenswrapper[4901]: E0309 02:45:16.681968 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-f7kz2" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756276 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-client-ca\") pod \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756339 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-config\") pod \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756379 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-464dl\" (UniqueName: \"kubernetes.io/projected/20e06fd3-8c8f-452c-9df4-25911ea82ac1-kube-api-access-464dl\") pod \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " Mar 09 02:45:16 crc 
kubenswrapper[4901]: I0309 02:45:16.756408 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f86d992b-5a2d-47ee-acf3-b73a05d43546-serving-cert\") pod \"f86d992b-5a2d-47ee-acf3-b73a05d43546\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756495 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-config\") pod \"f86d992b-5a2d-47ee-acf3-b73a05d43546\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756531 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20e06fd3-8c8f-452c-9df4-25911ea82ac1-serving-cert\") pod \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756556 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-proxy-ca-bundles\") pod \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\" (UID: \"20e06fd3-8c8f-452c-9df4-25911ea82ac1\") " Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756577 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-client-ca\") pod \"f86d992b-5a2d-47ee-acf3-b73a05d43546\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756620 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28jnw\" (UniqueName: \"kubernetes.io/projected/f86d992b-5a2d-47ee-acf3-b73a05d43546-kube-api-access-28jnw\") pod 
\"f86d992b-5a2d-47ee-acf3-b73a05d43546\" (UID: \"f86d992b-5a2d-47ee-acf3-b73a05d43546\") " Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756771 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-client-ca\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756831 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trgkt\" (UniqueName: \"kubernetes.io/projected/26347922-100f-4f1b-a96c-b4b2d1d5275d-kube-api-access-trgkt\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756857 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-proxy-ca-bundles\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756913 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-config\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.756932 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26347922-100f-4f1b-a96c-b4b2d1d5275d-serving-cert\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.757880 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-client-ca" (OuterVolumeSpecName: "client-ca") pod "20e06fd3-8c8f-452c-9df4-25911ea82ac1" (UID: "20e06fd3-8c8f-452c-9df4-25911ea82ac1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.758747 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "20e06fd3-8c8f-452c-9df4-25911ea82ac1" (UID: "20e06fd3-8c8f-452c-9df4-25911ea82ac1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.758891 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-config" (OuterVolumeSpecName: "config") pod "f86d992b-5a2d-47ee-acf3-b73a05d43546" (UID: "f86d992b-5a2d-47ee-acf3-b73a05d43546"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.759005 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-config" (OuterVolumeSpecName: "config") pod "20e06fd3-8c8f-452c-9df4-25911ea82ac1" (UID: "20e06fd3-8c8f-452c-9df4-25911ea82ac1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.759258 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-client-ca" (OuterVolumeSpecName: "client-ca") pod "f86d992b-5a2d-47ee-acf3-b73a05d43546" (UID: "f86d992b-5a2d-47ee-acf3-b73a05d43546"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.764499 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86d992b-5a2d-47ee-acf3-b73a05d43546-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f86d992b-5a2d-47ee-acf3-b73a05d43546" (UID: "f86d992b-5a2d-47ee-acf3-b73a05d43546"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.768070 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e06fd3-8c8f-452c-9df4-25911ea82ac1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "20e06fd3-8c8f-452c-9df4-25911ea82ac1" (UID: "20e06fd3-8c8f-452c-9df4-25911ea82ac1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.768215 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f86d992b-5a2d-47ee-acf3-b73a05d43546-kube-api-access-28jnw" (OuterVolumeSpecName: "kube-api-access-28jnw") pod "f86d992b-5a2d-47ee-acf3-b73a05d43546" (UID: "f86d992b-5a2d-47ee-acf3-b73a05d43546"). InnerVolumeSpecName "kube-api-access-28jnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.770581 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e06fd3-8c8f-452c-9df4-25911ea82ac1-kube-api-access-464dl" (OuterVolumeSpecName: "kube-api-access-464dl") pod "20e06fd3-8c8f-452c-9df4-25911ea82ac1" (UID: "20e06fd3-8c8f-452c-9df4-25911ea82ac1"). InnerVolumeSpecName "kube-api-access-464dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858133 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trgkt\" (UniqueName: \"kubernetes.io/projected/26347922-100f-4f1b-a96c-b4b2d1d5275d-kube-api-access-trgkt\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858178 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-proxy-ca-bundles\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858248 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-config\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858267 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/26347922-100f-4f1b-a96c-b4b2d1d5275d-serving-cert\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858310 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-client-ca\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858369 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858380 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858389 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28jnw\" (UniqueName: \"kubernetes.io/projected/f86d992b-5a2d-47ee-acf3-b73a05d43546-kube-api-access-28jnw\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858398 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858406 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20e06fd3-8c8f-452c-9df4-25911ea82ac1-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:16 crc 
kubenswrapper[4901]: I0309 02:45:16.858415 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-464dl\" (UniqueName: \"kubernetes.io/projected/20e06fd3-8c8f-452c-9df4-25911ea82ac1-kube-api-access-464dl\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858424 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f86d992b-5a2d-47ee-acf3-b73a05d43546-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858433 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86d992b-5a2d-47ee-acf3-b73a05d43546-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.858443 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20e06fd3-8c8f-452c-9df4-25911ea82ac1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.859347 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-client-ca\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.859502 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-proxy-ca-bundles\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.859989 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-config\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.862735 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26347922-100f-4f1b-a96c-b4b2d1d5275d-serving-cert\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.876554 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trgkt\" (UniqueName: \"kubernetes.io/projected/26347922-100f-4f1b-a96c-b4b2d1d5275d-kube-api-access-trgkt\") pod \"controller-manager-76c4d7d756-q8rgc\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.938352 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5"] Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.979733 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" event={"ID":"20e06fd3-8c8f-452c-9df4-25911ea82ac1","Type":"ContainerDied","Data":"09e748aa655e3e67d6d12350cda748d40cd03621cc2dc4c6fe825f571baa0e63"} Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.979811 4901 scope.go:117] "RemoveContainer" containerID="0f1ff6dd01bff90753a073c26942f8438816f62dbf721cb67d41d0a23e08dfae" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.979873 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54d6bfbf5c-w8487" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.983071 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-jffft_99c51354-0ea1-4e7a-bd34-1bb57e6422c0/cluster-samples-operator/0.log" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.983163 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jffft" event={"ID":"99c51354-0ea1-4e7a-bd34-1bb57e6422c0","Type":"ContainerStarted","Data":"1b013370c54320353c7949eddc8f12af7f14d81481f009d7813ffd0ba6b63316"} Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.985824 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" event={"ID":"f86d992b-5a2d-47ee-acf3-b73a05d43546","Type":"ContainerDied","Data":"2f9bd55901a29d4456ef5545596d85aa9e3f2c1979e133a61b8a9f30f3b5284f"} Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.985922 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8" Mar 09 02:45:16 crc kubenswrapper[4901]: I0309 02:45:16.989067 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:17 crc kubenswrapper[4901]: I0309 02:45:17.002658 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" event={"ID":"e7e0edae-f21a-473e-93a7-c9b797f9a112","Type":"ContainerStarted","Data":"3cb63ca142b0e2a13faa5421ff6dc1a77717a066f793d6f65134d09fb5261043"} Mar 09 02:45:17 crc kubenswrapper[4901]: E0309 02:45:17.029081 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8c7zs" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" Mar 09 02:45:17 crc kubenswrapper[4901]: E0309 02:45:17.029096 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-m9pkg" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" Mar 09 02:45:17 crc kubenswrapper[4901]: E0309 02:45:17.029172 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-f7kz2" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" Mar 09 02:45:17 crc kubenswrapper[4901]: E0309 02:45:17.029263 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rrzzj" 
podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" Mar 09 02:45:17 crc kubenswrapper[4901]: I0309 02:45:17.038699 4901 scope.go:117] "RemoveContainer" containerID="4d26710a3af4946b502fa37af205edd2be2a09071299c2d8432854be8491f006" Mar 09 02:45:17 crc kubenswrapper[4901]: I0309 02:45:17.115491 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54d6bfbf5c-w8487"] Mar 09 02:45:17 crc kubenswrapper[4901]: I0309 02:45:17.118397 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54d6bfbf5c-w8487"] Mar 09 02:45:17 crc kubenswrapper[4901]: I0309 02:45:17.130602 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8"] Mar 09 02:45:17 crc kubenswrapper[4901]: I0309 02:45:17.134369 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5499684995-bmjk8"] Mar 09 02:45:17 crc kubenswrapper[4901]: I0309 02:45:17.224101 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76c4d7d756-q8rgc"] Mar 09 02:45:17 crc kubenswrapper[4901]: W0309 02:45:17.232597 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26347922_100f_4f1b_a96c_b4b2d1d5275d.slice/crio-b4f015f0a469d9cb495271a0842f3aeadffcdac84cbb2b78a4a36023d561634f WatchSource:0}: Error finding container b4f015f0a469d9cb495271a0842f3aeadffcdac84cbb2b78a4a36023d561634f: Status 404 returned error can't find the container with id b4f015f0a469d9cb495271a0842f3aeadffcdac84cbb2b78a4a36023d561634f Mar 09 02:45:18 crc kubenswrapper[4901]: I0309 02:45:18.020831 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" 
event={"ID":"26347922-100f-4f1b-a96c-b4b2d1d5275d","Type":"ContainerStarted","Data":"86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25"} Mar 09 02:45:18 crc kubenswrapper[4901]: I0309 02:45:18.022323 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" event={"ID":"26347922-100f-4f1b-a96c-b4b2d1d5275d","Type":"ContainerStarted","Data":"b4f015f0a469d9cb495271a0842f3aeadffcdac84cbb2b78a4a36023d561634f"} Mar 09 02:45:18 crc kubenswrapper[4901]: I0309 02:45:18.022360 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:18 crc kubenswrapper[4901]: I0309 02:45:18.026792 4901 generic.go:334] "Generic (PLEG): container finished" podID="e7e0edae-f21a-473e-93a7-c9b797f9a112" containerID="3ee70dec9345649446b76daa64fb85b6eb014cc1effe10f165caa1e54849e223" exitCode=0 Mar 09 02:45:18 crc kubenswrapper[4901]: I0309 02:45:18.026857 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" event={"ID":"e7e0edae-f21a-473e-93a7-c9b797f9a112","Type":"ContainerDied","Data":"3ee70dec9345649446b76daa64fb85b6eb014cc1effe10f165caa1e54849e223"} Mar 09 02:45:18 crc kubenswrapper[4901]: I0309 02:45:18.027310 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:18 crc kubenswrapper[4901]: I0309 02:45:18.042505 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" podStartSLOduration=14.042485745 podStartE2EDuration="14.042485745s" podCreationTimestamp="2026-03-09 02:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:45:18.039466549 +0000 UTC 
m=+242.629130291" watchObservedRunningTime="2026-03-09 02:45:18.042485745 +0000 UTC m=+242.632149537" Mar 09 02:45:18 crc kubenswrapper[4901]: I0309 02:45:18.115421 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e06fd3-8c8f-452c-9df4-25911ea82ac1" path="/var/lib/kubelet/pods/20e06fd3-8c8f-452c-9df4-25911ea82ac1/volumes" Mar 09 02:45:18 crc kubenswrapper[4901]: I0309 02:45:18.116450 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f86d992b-5a2d-47ee-acf3-b73a05d43546" path="/var/lib/kubelet/pods/f86d992b-5a2d-47ee-acf3-b73a05d43546/volumes" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.280703 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.295054 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ppsc\" (UniqueName: \"kubernetes.io/projected/e7e0edae-f21a-473e-93a7-c9b797f9a112-kube-api-access-6ppsc\") pod \"e7e0edae-f21a-473e-93a7-c9b797f9a112\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.295131 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7e0edae-f21a-473e-93a7-c9b797f9a112-secret-volume\") pod \"e7e0edae-f21a-473e-93a7-c9b797f9a112\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.295168 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e0edae-f21a-473e-93a7-c9b797f9a112-config-volume\") pod \"e7e0edae-f21a-473e-93a7-c9b797f9a112\" (UID: \"e7e0edae-f21a-473e-93a7-c9b797f9a112\") " Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.296559 4901 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e0edae-f21a-473e-93a7-c9b797f9a112-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7e0edae-f21a-473e-93a7-c9b797f9a112" (UID: "e7e0edae-f21a-473e-93a7-c9b797f9a112"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.305932 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e0edae-f21a-473e-93a7-c9b797f9a112-kube-api-access-6ppsc" (OuterVolumeSpecName: "kube-api-access-6ppsc") pod "e7e0edae-f21a-473e-93a7-c9b797f9a112" (UID: "e7e0edae-f21a-473e-93a7-c9b797f9a112"). InnerVolumeSpecName "kube-api-access-6ppsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.310108 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e0edae-f21a-473e-93a7-c9b797f9a112-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7e0edae-f21a-473e-93a7-c9b797f9a112" (UID: "e7e0edae-f21a-473e-93a7-c9b797f9a112"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.370115 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 02:45:19 crc kubenswrapper[4901]: E0309 02:45:19.370471 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e0edae-f21a-473e-93a7-c9b797f9a112" containerName="collect-profiles" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.370488 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e0edae-f21a-473e-93a7-c9b797f9a112" containerName="collect-profiles" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.370622 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e0edae-f21a-473e-93a7-c9b797f9a112" containerName="collect-profiles" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.371109 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.373426 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.373782 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.385159 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.397115 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0bda5c71-147c-4195-84a2-1bf36dc5b8d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 02:45:19 crc kubenswrapper[4901]: 
I0309 02:45:19.397195 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0bda5c71-147c-4195-84a2-1bf36dc5b8d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.397408 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ppsc\" (UniqueName: \"kubernetes.io/projected/e7e0edae-f21a-473e-93a7-c9b797f9a112-kube-api-access-6ppsc\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.397441 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7e0edae-f21a-473e-93a7-c9b797f9a112-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.397456 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e0edae-f21a-473e-93a7-c9b797f9a112-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.462041 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jjc7" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.499649 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0bda5c71-147c-4195-84a2-1bf36dc5b8d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.499758 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0bda5c71-147c-4195-84a2-1bf36dc5b8d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.500256 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0bda5c71-147c-4195-84a2-1bf36dc5b8d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.533509 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0bda5c71-147c-4195-84a2-1bf36dc5b8d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.558683 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k"] Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.559763 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.561598 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.561955 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.562144 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.562318 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.562518 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.563149 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.578390 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k"] Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.601134 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gklt\" (UniqueName: \"kubernetes.io/projected/586fa848-a7a2-4543-8ff7-894ed18ba93f-kube-api-access-2gklt\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.601244 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-config\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.601273 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/586fa848-a7a2-4543-8ff7-894ed18ba93f-serving-cert\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.601300 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-client-ca\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.700874 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.711921 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gklt\" (UniqueName: \"kubernetes.io/projected/586fa848-a7a2-4543-8ff7-894ed18ba93f-kube-api-access-2gklt\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.711997 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-config\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.712025 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/586fa848-a7a2-4543-8ff7-894ed18ba93f-serving-cert\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.712048 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-client-ca\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.712817 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-client-ca\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.714436 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-config\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.718510 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/586fa848-a7a2-4543-8ff7-894ed18ba93f-serving-cert\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.729111 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gklt\" (UniqueName: \"kubernetes.io/projected/586fa848-a7a2-4543-8ff7-894ed18ba93f-kube-api-access-2gklt\") pod \"route-controller-manager-56bb5fb5dc-v9v8k\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:19 crc kubenswrapper[4901]: I0309 02:45:19.881816 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:20 crc kubenswrapper[4901]: I0309 02:45:20.039270 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" Mar 09 02:45:20 crc kubenswrapper[4901]: I0309 02:45:20.039397 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5" event={"ID":"e7e0edae-f21a-473e-93a7-c9b797f9a112","Type":"ContainerDied","Data":"3cb63ca142b0e2a13faa5421ff6dc1a77717a066f793d6f65134d09fb5261043"} Mar 09 02:45:20 crc kubenswrapper[4901]: I0309 02:45:20.039442 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cb63ca142b0e2a13faa5421ff6dc1a77717a066f793d6f65134d09fb5261043" Mar 09 02:45:20 crc kubenswrapper[4901]: I0309 02:45:20.085643 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 02:45:20 crc kubenswrapper[4901]: I0309 02:45:20.292695 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k"] Mar 09 02:45:20 crc kubenswrapper[4901]: W0309 02:45:20.304901 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod586fa848_a7a2_4543_8ff7_894ed18ba93f.slice/crio-f322a0e1171376be7d014d1cda548528195e1ecb30d6775ffe9a504729274ad4 WatchSource:0}: Error finding container f322a0e1171376be7d014d1cda548528195e1ecb30d6775ffe9a504729274ad4: Status 404 returned error can't find the container with id f322a0e1171376be7d014d1cda548528195e1ecb30d6775ffe9a504729274ad4 Mar 09 02:45:21 crc kubenswrapper[4901]: I0309 02:45:21.045000 4901 generic.go:334] "Generic (PLEG): container finished" podID="0bda5c71-147c-4195-84a2-1bf36dc5b8d1" containerID="fd6e01c7b37d1b63de481b81b1db3293f1292ec4ac35a0d0d533a1cb4a5af8b4" exitCode=0 Mar 09 02:45:21 crc kubenswrapper[4901]: I0309 02:45:21.045067 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"0bda5c71-147c-4195-84a2-1bf36dc5b8d1","Type":"ContainerDied","Data":"fd6e01c7b37d1b63de481b81b1db3293f1292ec4ac35a0d0d533a1cb4a5af8b4"} Mar 09 02:45:21 crc kubenswrapper[4901]: I0309 02:45:21.045286 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0bda5c71-147c-4195-84a2-1bf36dc5b8d1","Type":"ContainerStarted","Data":"2cd86f3e81e6a893137ca9b8d4267b8ebc074d104de64266b3385993b910a08a"} Mar 09 02:45:21 crc kubenswrapper[4901]: I0309 02:45:21.047105 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" event={"ID":"586fa848-a7a2-4543-8ff7-894ed18ba93f","Type":"ContainerStarted","Data":"d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e"} Mar 09 02:45:21 crc kubenswrapper[4901]: I0309 02:45:21.047147 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" event={"ID":"586fa848-a7a2-4543-8ff7-894ed18ba93f","Type":"ContainerStarted","Data":"f322a0e1171376be7d014d1cda548528195e1ecb30d6775ffe9a504729274ad4"} Mar 09 02:45:21 crc kubenswrapper[4901]: I0309 02:45:21.047337 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:21 crc kubenswrapper[4901]: I0309 02:45:21.053141 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:21 crc kubenswrapper[4901]: I0309 02:45:21.092688 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" podStartSLOduration=17.092669775 podStartE2EDuration="17.092669775s" podCreationTimestamp="2026-03-09 02:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:45:21.092014364 +0000 UTC m=+245.681678096" watchObservedRunningTime="2026-03-09 02:45:21.092669775 +0000 UTC m=+245.682333507" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.146972 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.150604 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.171613 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e883667-62d8-4920-a810-558a77f260ca-metrics-certs\") pod \"network-metrics-daemon-lg26b\" (UID: \"9e883667-62d8-4920-a810-558a77f260ca\") " pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.254977 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.326405 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.335142 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lg26b" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.388801 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.451445 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kube-api-access\") pod \"0bda5c71-147c-4195-84a2-1bf36dc5b8d1\" (UID: \"0bda5c71-147c-4195-84a2-1bf36dc5b8d1\") " Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.451563 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kubelet-dir\") pod \"0bda5c71-147c-4195-84a2-1bf36dc5b8d1\" (UID: \"0bda5c71-147c-4195-84a2-1bf36dc5b8d1\") " Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.451844 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0bda5c71-147c-4195-84a2-1bf36dc5b8d1" (UID: "0bda5c71-147c-4195-84a2-1bf36dc5b8d1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.456295 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0bda5c71-147c-4195-84a2-1bf36dc5b8d1" (UID: "0bda5c71-147c-4195-84a2-1bf36dc5b8d1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.552322 4901 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.552356 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bda5c71-147c-4195-84a2-1bf36dc5b8d1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.733545 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lg26b"] Mar 09 02:45:22 crc kubenswrapper[4901]: W0309 02:45:22.738165 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e883667_62d8_4920_a810_558a77f260ca.slice/crio-9750abc21f8c14efa4da9aef9453e2bee0759ac5ede6e166d5d14c622abdd529 WatchSource:0}: Error finding container 9750abc21f8c14efa4da9aef9453e2bee0759ac5ede6e166d5d14c622abdd529: Status 404 returned error can't find the container with id 9750abc21f8c14efa4da9aef9453e2bee0759ac5ede6e166d5d14c622abdd529 Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.876775 4901 csr.go:261] certificate signing request csr-j9xqj is approved, waiting to be issued Mar 09 02:45:22 crc kubenswrapper[4901]: I0309 02:45:22.889096 4901 csr.go:257] certificate signing request csr-j9xqj is issued Mar 09 02:45:23 crc kubenswrapper[4901]: I0309 02:45:23.071381 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lg26b" event={"ID":"9e883667-62d8-4920-a810-558a77f260ca","Type":"ContainerStarted","Data":"37c0cae66804fc7ea12ed6ec108d38c19137c2d522539c0cce22b0faa8af2704"} Mar 09 02:45:23 crc kubenswrapper[4901]: I0309 02:45:23.071812 4901 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lg26b" event={"ID":"9e883667-62d8-4920-a810-558a77f260ca","Type":"ContainerStarted","Data":"9750abc21f8c14efa4da9aef9453e2bee0759ac5ede6e166d5d14c622abdd529"} Mar 09 02:45:23 crc kubenswrapper[4901]: I0309 02:45:23.075992 4901 generic.go:334] "Generic (PLEG): container finished" podID="680e9e87-71a2-402c-84f2-e8eb2b7a4c44" containerID="d71e6d807ace698720a2a4dd399cb57a364cb762660df830f7ad7c026fc22a73" exitCode=0 Mar 09 02:45:23 crc kubenswrapper[4901]: I0309 02:45:23.076082 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550404-w8858" event={"ID":"680e9e87-71a2-402c-84f2-e8eb2b7a4c44","Type":"ContainerDied","Data":"d71e6d807ace698720a2a4dd399cb57a364cb762660df830f7ad7c026fc22a73"} Mar 09 02:45:23 crc kubenswrapper[4901]: I0309 02:45:23.082166 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0bda5c71-147c-4195-84a2-1bf36dc5b8d1","Type":"ContainerDied","Data":"2cd86f3e81e6a893137ca9b8d4267b8ebc074d104de64266b3385993b910a08a"} Mar 09 02:45:23 crc kubenswrapper[4901]: I0309 02:45:23.082250 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 02:45:23 crc kubenswrapper[4901]: I0309 02:45:23.082406 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd86f3e81e6a893137ca9b8d4267b8ebc074d104de64266b3385993b910a08a" Mar 09 02:45:23 crc kubenswrapper[4901]: I0309 02:45:23.890069 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-18 04:57:26.782853633 +0000 UTC Mar 09 02:45:23 crc kubenswrapper[4901]: I0309 02:45:23.890111 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6818h12m2.892744728s for next certificate rotation Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.088756 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lg26b" event={"ID":"9e883667-62d8-4920-a810-558a77f260ca","Type":"ContainerStarted","Data":"e745c9564aa18e329dd229e93611b6f0eb8ae1c87eeeefc1e1bab3df60f02b21"} Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.125267 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lg26b" podStartSLOduration=177.125207097 podStartE2EDuration="2m57.125207097s" podCreationTimestamp="2026-03-09 02:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:45:24.111009488 +0000 UTC m=+248.700673300" watchObservedRunningTime="2026-03-09 02:45:24.125207097 +0000 UTC m=+248.714870869" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.165839 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 02:45:24 crc kubenswrapper[4901]: E0309 02:45:24.166073 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bda5c71-147c-4195-84a2-1bf36dc5b8d1" containerName="pruner" Mar 09 
02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.166090 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bda5c71-147c-4195-84a2-1bf36dc5b8d1" containerName="pruner" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.166240 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bda5c71-147c-4195-84a2-1bf36dc5b8d1" containerName="pruner" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.166677 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.169616 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.171635 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.196979 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.257438 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76c4d7d756-q8rgc"] Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.257636 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" podUID="26347922-100f-4f1b-a96c-b4b2d1d5275d" containerName="controller-manager" containerID="cri-o://86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25" gracePeriod=30 Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.279842 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-var-lock\") pod \"installer-9-crc\" (UID: 
\"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.279913 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/946ddaae-0092-45f2-b5af-3a5168fa64e8-kube-api-access\") pod \"installer-9-crc\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.279937 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.352883 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k"] Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.353066 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" podUID="586fa848-a7a2-4543-8ff7-894ed18ba93f" containerName="route-controller-manager" containerID="cri-o://d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e" gracePeriod=30 Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.381781 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/946ddaae-0092-45f2-b5af-3a5168fa64e8-kube-api-access\") pod \"installer-9-crc\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.381821 4901 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.381889 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-var-lock\") pod \"installer-9-crc\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.381949 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-var-lock\") pod \"installer-9-crc\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.382187 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.420100 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/946ddaae-0092-45f2-b5af-3a5168fa64e8-kube-api-access\") pod \"installer-9-crc\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.487774 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.571160 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550404-w8858" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.685686 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkn9d\" (UniqueName: \"kubernetes.io/projected/680e9e87-71a2-402c-84f2-e8eb2b7a4c44-kube-api-access-dkn9d\") pod \"680e9e87-71a2-402c-84f2-e8eb2b7a4c44\" (UID: \"680e9e87-71a2-402c-84f2-e8eb2b7a4c44\") " Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.690894 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680e9e87-71a2-402c-84f2-e8eb2b7a4c44-kube-api-access-dkn9d" (OuterVolumeSpecName: "kube-api-access-dkn9d") pod "680e9e87-71a2-402c-84f2-e8eb2b7a4c44" (UID: "680e9e87-71a2-402c-84f2-e8eb2b7a4c44"). InnerVolumeSpecName "kube-api-access-dkn9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.787308 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkn9d\" (UniqueName: \"kubernetes.io/projected/680e9e87-71a2-402c-84f2-e8eb2b7a4c44-kube-api-access-dkn9d\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.819528 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.866587 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.888590 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-proxy-ca-bundles\") pod \"26347922-100f-4f1b-a96c-b4b2d1d5275d\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.888677 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gklt\" (UniqueName: \"kubernetes.io/projected/586fa848-a7a2-4543-8ff7-894ed18ba93f-kube-api-access-2gklt\") pod \"586fa848-a7a2-4543-8ff7-894ed18ba93f\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.888720 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26347922-100f-4f1b-a96c-b4b2d1d5275d-serving-cert\") pod \"26347922-100f-4f1b-a96c-b4b2d1d5275d\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.888740 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-config\") pod \"26347922-100f-4f1b-a96c-b4b2d1d5275d\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.888791 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/586fa848-a7a2-4543-8ff7-894ed18ba93f-serving-cert\") pod \"586fa848-a7a2-4543-8ff7-894ed18ba93f\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.888840 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-config\") pod \"586fa848-a7a2-4543-8ff7-894ed18ba93f\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.888875 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-client-ca\") pod \"586fa848-a7a2-4543-8ff7-894ed18ba93f\" (UID: \"586fa848-a7a2-4543-8ff7-894ed18ba93f\") " Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.888962 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-client-ca\") pod \"26347922-100f-4f1b-a96c-b4b2d1d5275d\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.889018 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trgkt\" (UniqueName: \"kubernetes.io/projected/26347922-100f-4f1b-a96c-b4b2d1d5275d-kube-api-access-trgkt\") pod \"26347922-100f-4f1b-a96c-b4b2d1d5275d\" (UID: \"26347922-100f-4f1b-a96c-b4b2d1d5275d\") " Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.889821 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-client-ca" (OuterVolumeSpecName: "client-ca") pod "26347922-100f-4f1b-a96c-b4b2d1d5275d" (UID: "26347922-100f-4f1b-a96c-b4b2d1d5275d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.889973 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "26347922-100f-4f1b-a96c-b4b2d1d5275d" (UID: "26347922-100f-4f1b-a96c-b4b2d1d5275d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.890128 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-config" (OuterVolumeSpecName: "config") pod "586fa848-a7a2-4543-8ff7-894ed18ba93f" (UID: "586fa848-a7a2-4543-8ff7-894ed18ba93f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.890119 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-config" (OuterVolumeSpecName: "config") pod "26347922-100f-4f1b-a96c-b4b2d1d5275d" (UID: "26347922-100f-4f1b-a96c-b4b2d1d5275d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.890171 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-client-ca" (OuterVolumeSpecName: "client-ca") pod "586fa848-a7a2-4543-8ff7-894ed18ba93f" (UID: "586fa848-a7a2-4543-8ff7-894ed18ba93f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.890183 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-03 23:55:19.715894383 +0000 UTC Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.890199 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7221h9m54.825696656s for next certificate rotation Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.890657 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.890672 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.890681 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.890689 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/586fa848-a7a2-4543-8ff7-894ed18ba93f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.890697 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26347922-100f-4f1b-a96c-b4b2d1d5275d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.892297 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586fa848-a7a2-4543-8ff7-894ed18ba93f-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "586fa848-a7a2-4543-8ff7-894ed18ba93f" (UID: "586fa848-a7a2-4543-8ff7-894ed18ba93f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.893103 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26347922-100f-4f1b-a96c-b4b2d1d5275d-kube-api-access-trgkt" (OuterVolumeSpecName: "kube-api-access-trgkt") pod "26347922-100f-4f1b-a96c-b4b2d1d5275d" (UID: "26347922-100f-4f1b-a96c-b4b2d1d5275d"). InnerVolumeSpecName "kube-api-access-trgkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.893327 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26347922-100f-4f1b-a96c-b4b2d1d5275d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "26347922-100f-4f1b-a96c-b4b2d1d5275d" (UID: "26347922-100f-4f1b-a96c-b4b2d1d5275d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.895038 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586fa848-a7a2-4543-8ff7-894ed18ba93f-kube-api-access-2gklt" (OuterVolumeSpecName: "kube-api-access-2gklt") pod "586fa848-a7a2-4543-8ff7-894ed18ba93f" (UID: "586fa848-a7a2-4543-8ff7-894ed18ba93f"). InnerVolumeSpecName "kube-api-access-2gklt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.971202 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 02:45:24 crc kubenswrapper[4901]: W0309 02:45:24.975902 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod946ddaae_0092_45f2_b5af_3a5168fa64e8.slice/crio-0fb69dfcb4ae21b653c77905eaf02d3b8352b0c01fccf190870874141a2381de WatchSource:0}: Error finding container 0fb69dfcb4ae21b653c77905eaf02d3b8352b0c01fccf190870874141a2381de: Status 404 returned error can't find the container with id 0fb69dfcb4ae21b653c77905eaf02d3b8352b0c01fccf190870874141a2381de Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.991881 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gklt\" (UniqueName: \"kubernetes.io/projected/586fa848-a7a2-4543-8ff7-894ed18ba93f-kube-api-access-2gklt\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.992094 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26347922-100f-4f1b-a96c-b4b2d1d5275d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.992104 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/586fa848-a7a2-4543-8ff7-894ed18ba93f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:24 crc kubenswrapper[4901]: I0309 02:45:24.992112 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trgkt\" (UniqueName: \"kubernetes.io/projected/26347922-100f-4f1b-a96c-b4b2d1d5275d-kube-api-access-trgkt\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.098271 4901 generic.go:334] "Generic (PLEG): container finished" podID="26347922-100f-4f1b-a96c-b4b2d1d5275d" 
containerID="86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25" exitCode=0 Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.098387 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.098407 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" event={"ID":"26347922-100f-4f1b-a96c-b4b2d1d5275d","Type":"ContainerDied","Data":"86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25"} Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.099605 4901 scope.go:117] "RemoveContainer" containerID="86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.100411 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76c4d7d756-q8rgc" event={"ID":"26347922-100f-4f1b-a96c-b4b2d1d5275d","Type":"ContainerDied","Data":"b4f015f0a469d9cb495271a0842f3aeadffcdac84cbb2b78a4a36023d561634f"} Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.101815 4901 generic.go:334] "Generic (PLEG): container finished" podID="586fa848-a7a2-4543-8ff7-894ed18ba93f" containerID="d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e" exitCode=0 Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.102004 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.102154 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" event={"ID":"586fa848-a7a2-4543-8ff7-894ed18ba93f","Type":"ContainerDied","Data":"d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e"} Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.102254 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k" event={"ID":"586fa848-a7a2-4543-8ff7-894ed18ba93f","Type":"ContainerDied","Data":"f322a0e1171376be7d014d1cda548528195e1ecb30d6775ffe9a504729274ad4"} Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.107136 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"946ddaae-0092-45f2-b5af-3a5168fa64e8","Type":"ContainerStarted","Data":"0fb69dfcb4ae21b653c77905eaf02d3b8352b0c01fccf190870874141a2381de"} Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.108639 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550404-w8858" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.108635 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550404-w8858" event={"ID":"680e9e87-71a2-402c-84f2-e8eb2b7a4c44","Type":"ContainerDied","Data":"42b0097386974baecd436dfb54d11bfe48767366b4c548092d36b19648b087a0"} Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.108775 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42b0097386974baecd436dfb54d11bfe48767366b4c548092d36b19648b087a0" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.116617 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6kj" event={"ID":"da9622a3-9e9b-4c8a-86d3-110f44883bbc","Type":"ContainerStarted","Data":"1feee7a0be93d3f596c3f82ac98106bbdfad7a29435505a3976377cb4f90117f"} Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.128000 4901 scope.go:117] "RemoveContainer" containerID="86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25" Mar 09 02:45:25 crc kubenswrapper[4901]: E0309 02:45:25.128732 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25\": container with ID starting with 86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25 not found: ID does not exist" containerID="86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.128782 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25"} err="failed to get container status \"86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25\": rpc error: code = NotFound desc = could not find container 
\"86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25\": container with ID starting with 86cdbab10b9f0d293bc5330cf6fcdd54e286ef48b943a8f67749249b38229d25 not found: ID does not exist" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.128802 4901 scope.go:117] "RemoveContainer" containerID="d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.132687 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76c4d7d756-q8rgc"] Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.137148 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76c4d7d756-q8rgc"] Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.166414 4901 scope.go:117] "RemoveContainer" containerID="d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.167006 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k"] Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.169238 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56bb5fb5dc-v9v8k"] Mar 09 02:45:25 crc kubenswrapper[4901]: E0309 02:45:25.170579 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e\": container with ID starting with d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e not found: ID does not exist" containerID="d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.170620 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e"} err="failed to get container status \"d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e\": rpc error: code = NotFound desc = could not find container \"d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e\": container with ID starting with d9a5be4e7713171756048b8053379c5d964f79d9410805f29bb56273714be81e not found: ID does not exist" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.564692 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77db75d486-lj5ct"] Mar 09 02:45:25 crc kubenswrapper[4901]: E0309 02:45:25.565179 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586fa848-a7a2-4543-8ff7-894ed18ba93f" containerName="route-controller-manager" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.565274 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="586fa848-a7a2-4543-8ff7-894ed18ba93f" containerName="route-controller-manager" Mar 09 02:45:25 crc kubenswrapper[4901]: E0309 02:45:25.565369 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680e9e87-71a2-402c-84f2-e8eb2b7a4c44" containerName="oc" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.565448 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="680e9e87-71a2-402c-84f2-e8eb2b7a4c44" containerName="oc" Mar 09 02:45:25 crc kubenswrapper[4901]: E0309 02:45:25.565527 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26347922-100f-4f1b-a96c-b4b2d1d5275d" containerName="controller-manager" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.565588 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="26347922-100f-4f1b-a96c-b4b2d1d5275d" containerName="controller-manager" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.565740 4901 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="680e9e87-71a2-402c-84f2-e8eb2b7a4c44" containerName="oc" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.565807 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="586fa848-a7a2-4543-8ff7-894ed18ba93f" containerName="route-controller-manager" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.565868 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="26347922-100f-4f1b-a96c-b4b2d1d5275d" containerName="controller-manager" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.566313 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.568017 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.568378 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.568534 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.568897 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.569107 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.572401 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.575043 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 
02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.599310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-serving-cert\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.599449 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56sgf\" (UniqueName: \"kubernetes.io/projected/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-kube-api-access-56sgf\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.599555 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-config\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.599706 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-proxy-ca-bundles\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.599797 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-client-ca\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.602388 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd"] Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.603056 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.605369 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd"] Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.606666 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.606945 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.607066 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.607079 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.607149 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.607271 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.609163 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77db75d486-lj5ct"] Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.700884 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-client-ca\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.700929 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-config\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.700966 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwtdd\" (UniqueName: \"kubernetes.io/projected/569f3f72-275f-43c8-b9ea-258fff4d7af5-kube-api-access-hwtdd\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.701113 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-proxy-ca-bundles\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " 
pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.701201 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-config\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.701298 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-client-ca\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.701444 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-serving-cert\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.701495 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56sgf\" (UniqueName: \"kubernetes.io/projected/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-kube-api-access-56sgf\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.701535 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/569f3f72-275f-43c8-b9ea-258fff4d7af5-serving-cert\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.701989 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-client-ca\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.702070 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-proxy-ca-bundles\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.702159 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-config\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.707298 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-serving-cert\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.719424 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-56sgf\" (UniqueName: \"kubernetes.io/projected/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-kube-api-access-56sgf\") pod \"controller-manager-77db75d486-lj5ct\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.802437 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwtdd\" (UniqueName: \"kubernetes.io/projected/569f3f72-275f-43c8-b9ea-258fff4d7af5-kube-api-access-hwtdd\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.802727 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-config\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.802788 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/569f3f72-275f-43c8-b9ea-258fff4d7af5-serving-cert\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.802805 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-client-ca\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " 
pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.803970 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-client-ca\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.806617 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/569f3f72-275f-43c8-b9ea-258fff4d7af5-serving-cert\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.806698 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-config\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.816435 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwtdd\" (UniqueName: \"kubernetes.io/projected/569f3f72-275f-43c8-b9ea-258fff4d7af5-kube-api-access-hwtdd\") pod \"route-controller-manager-74cfc7b869-mhczd\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.926206 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:25 crc kubenswrapper[4901]: I0309 02:45:25.933436 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:26 crc kubenswrapper[4901]: I0309 02:45:26.123301 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26347922-100f-4f1b-a96c-b4b2d1d5275d" path="/var/lib/kubelet/pods/26347922-100f-4f1b-a96c-b4b2d1d5275d/volumes" Mar 09 02:45:26 crc kubenswrapper[4901]: I0309 02:45:26.124766 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586fa848-a7a2-4543-8ff7-894ed18ba93f" path="/var/lib/kubelet/pods/586fa848-a7a2-4543-8ff7-894ed18ba93f/volumes" Mar 09 02:45:26 crc kubenswrapper[4901]: I0309 02:45:26.133512 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"946ddaae-0092-45f2-b5af-3a5168fa64e8","Type":"ContainerStarted","Data":"14d8eb66d461e65f942f8415e30e0dd40a4d9ece6ecd33c7d045290d53091edc"} Mar 09 02:45:26 crc kubenswrapper[4901]: I0309 02:45:26.144084 4901 generic.go:334] "Generic (PLEG): container finished" podID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerID="1feee7a0be93d3f596c3f82ac98106bbdfad7a29435505a3976377cb4f90117f" exitCode=0 Mar 09 02:45:26 crc kubenswrapper[4901]: I0309 02:45:26.144134 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6kj" event={"ID":"da9622a3-9e9b-4c8a-86d3-110f44883bbc","Type":"ContainerDied","Data":"1feee7a0be93d3f596c3f82ac98106bbdfad7a29435505a3976377cb4f90117f"} Mar 09 02:45:26 crc kubenswrapper[4901]: I0309 02:45:26.194848 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.194830463 podStartE2EDuration="2.194830463s" podCreationTimestamp="2026-03-09 
02:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:45:26.193193541 +0000 UTC m=+250.782857283" watchObservedRunningTime="2026-03-09 02:45:26.194830463 +0000 UTC m=+250.784494195" Mar 09 02:45:26 crc kubenswrapper[4901]: I0309 02:45:26.356879 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77db75d486-lj5ct"] Mar 09 02:45:26 crc kubenswrapper[4901]: W0309 02:45:26.368424 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dc62bb1_5c7f_404c_ba83_f4a7fde3f1aa.slice/crio-efa6a441234d44a6ac4a828f6a7d9ccb6e6b8dce2c0364017657c382d20805d7 WatchSource:0}: Error finding container efa6a441234d44a6ac4a828f6a7d9ccb6e6b8dce2c0364017657c382d20805d7: Status 404 returned error can't find the container with id efa6a441234d44a6ac4a828f6a7d9ccb6e6b8dce2c0364017657c382d20805d7 Mar 09 02:45:26 crc kubenswrapper[4901]: I0309 02:45:26.404824 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd"] Mar 09 02:45:26 crc kubenswrapper[4901]: W0309 02:45:26.411417 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod569f3f72_275f_43c8_b9ea_258fff4d7af5.slice/crio-c98f78d91585d4f08a96777d3290763dc46f6a4c47710662e3bf6e6cb11a8636 WatchSource:0}: Error finding container c98f78d91585d4f08a96777d3290763dc46f6a4c47710662e3bf6e6cb11a8636: Status 404 returned error can't find the container with id c98f78d91585d4f08a96777d3290763dc46f6a4c47710662e3bf6e6cb11a8636 Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.155994 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" 
event={"ID":"569f3f72-275f-43c8-b9ea-258fff4d7af5","Type":"ContainerStarted","Data":"75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a"} Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.157107 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.157193 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" event={"ID":"569f3f72-275f-43c8-b9ea-258fff4d7af5","Type":"ContainerStarted","Data":"c98f78d91585d4f08a96777d3290763dc46f6a4c47710662e3bf6e6cb11a8636"} Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.158417 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6kj" event={"ID":"da9622a3-9e9b-4c8a-86d3-110f44883bbc","Type":"ContainerStarted","Data":"3375ff6f15a7dd0ad28fa714ada39815827c3e7b828fd77c76a4ade566c07e27"} Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.159680 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" event={"ID":"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa","Type":"ContainerStarted","Data":"d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f"} Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.159718 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" event={"ID":"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa","Type":"ContainerStarted","Data":"efa6a441234d44a6ac4a828f6a7d9ccb6e6b8dce2c0364017657c382d20805d7"} Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.159869 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.163179 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llngg" event={"ID":"f9d5529e-413e-4c88-98d8-1df5a9e55721","Type":"ContainerStarted","Data":"c43c00179f1b30f4609b2c2dd9b5045ee6f1cab9bd2b4e5a78066fd47e238a64"} Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.163577 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.164456 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.176879 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" podStartSLOduration=3.176865445 podStartE2EDuration="3.176865445s" podCreationTimestamp="2026-03-09 02:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:45:27.173951503 +0000 UTC m=+251.763615235" watchObservedRunningTime="2026-03-09 02:45:27.176865445 +0000 UTC m=+251.766529177" Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.220156 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jp6kj" podStartSLOduration=2.354656308 podStartE2EDuration="39.220142515s" podCreationTimestamp="2026-03-09 02:44:48 +0000 UTC" firstStartedPulling="2026-03-09 02:44:49.785315683 +0000 UTC m=+214.374979415" lastFinishedPulling="2026-03-09 02:45:26.65080186 +0000 UTC m=+251.240465622" observedRunningTime="2026-03-09 02:45:27.218533995 +0000 UTC m=+251.808197727" watchObservedRunningTime="2026-03-09 02:45:27.220142515 +0000 UTC m=+251.809806247" Mar 09 02:45:27 crc kubenswrapper[4901]: I0309 02:45:27.237901 4901 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" podStartSLOduration=3.237880947 podStartE2EDuration="3.237880947s" podCreationTimestamp="2026-03-09 02:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:45:27.234373256 +0000 UTC m=+251.824036988" watchObservedRunningTime="2026-03-09 02:45:27.237880947 +0000 UTC m=+251.827544679" Mar 09 02:45:28 crc kubenswrapper[4901]: I0309 02:45:28.169707 4901 generic.go:334] "Generic (PLEG): container finished" podID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerID="c43c00179f1b30f4609b2c2dd9b5045ee6f1cab9bd2b4e5a78066fd47e238a64" exitCode=0 Mar 09 02:45:28 crc kubenswrapper[4901]: I0309 02:45:28.169930 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llngg" event={"ID":"f9d5529e-413e-4c88-98d8-1df5a9e55721","Type":"ContainerDied","Data":"c43c00179f1b30f4609b2c2dd9b5045ee6f1cab9bd2b4e5a78066fd47e238a64"} Mar 09 02:45:28 crc kubenswrapper[4901]: I0309 02:45:28.575318 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:45:28 crc kubenswrapper[4901]: I0309 02:45:28.577507 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:45:29 crc kubenswrapper[4901]: I0309 02:45:29.178455 4901 generic.go:334] "Generic (PLEG): container finished" podID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerID="3209d20df68398b5c74788dabea1a04c821a3282bc7f7575499b8b0b0a83bfd3" exitCode=0 Mar 09 02:45:29 crc kubenswrapper[4901]: I0309 02:45:29.178527 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvtrc" 
event={"ID":"1092e265-a1ed-40f3-9a91-c1996ea7479c","Type":"ContainerDied","Data":"3209d20df68398b5c74788dabea1a04c821a3282bc7f7575499b8b0b0a83bfd3"} Mar 09 02:45:29 crc kubenswrapper[4901]: I0309 02:45:29.182320 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llngg" event={"ID":"f9d5529e-413e-4c88-98d8-1df5a9e55721","Type":"ContainerStarted","Data":"cefbfc61f3b1f2cb4e7068df7a3a2f26eba4651a63f3a3a9c965132c8d0f984c"} Mar 09 02:45:29 crc kubenswrapper[4901]: I0309 02:45:29.213374 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-llngg" podStartSLOduration=2.134002506 podStartE2EDuration="43.213359219s" podCreationTimestamp="2026-03-09 02:44:46 +0000 UTC" firstStartedPulling="2026-03-09 02:44:47.586350402 +0000 UTC m=+212.176014134" lastFinishedPulling="2026-03-09 02:45:28.665707105 +0000 UTC m=+253.255370847" observedRunningTime="2026-03-09 02:45:29.209673656 +0000 UTC m=+253.799337388" watchObservedRunningTime="2026-03-09 02:45:29.213359219 +0000 UTC m=+253.803022951" Mar 09 02:45:29 crc kubenswrapper[4901]: I0309 02:45:29.719055 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-jp6kj" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerName="registry-server" probeResult="failure" output=< Mar 09 02:45:29 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 02:45:29 crc kubenswrapper[4901]: > Mar 09 02:45:30 crc kubenswrapper[4901]: I0309 02:45:30.188829 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvtrc" event={"ID":"1092e265-a1ed-40f3-9a91-c1996ea7479c","Type":"ContainerStarted","Data":"39efb4f2ab60f7232608db324b7dfe0d78f50cfa4a13054ef9e97b8132ac8389"} Mar 09 02:45:30 crc kubenswrapper[4901]: I0309 02:45:30.863213 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:45:30 crc kubenswrapper[4901]: I0309 02:45:30.863305 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:45:31 crc kubenswrapper[4901]: I0309 02:45:31.125044 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lvtrc" podStartSLOduration=4.350718331 podStartE2EDuration="44.125022826s" podCreationTimestamp="2026-03-09 02:44:47 +0000 UTC" firstStartedPulling="2026-03-09 02:44:49.790983212 +0000 UTC m=+214.380646944" lastFinishedPulling="2026-03-09 02:45:29.565287707 +0000 UTC m=+254.154951439" observedRunningTime="2026-03-09 02:45:30.2152693 +0000 UTC m=+254.804933032" watchObservedRunningTime="2026-03-09 02:45:31.125022826 +0000 UTC m=+255.714686558" Mar 09 02:45:31 crc kubenswrapper[4901]: I0309 02:45:31.195195 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrzzj" event={"ID":"980a688d-18b9-4f90-9255-a55568e7bbc0","Type":"ContainerStarted","Data":"d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a"} Mar 09 02:45:31 crc kubenswrapper[4901]: I0309 02:45:31.196836 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9pkg" event={"ID":"1f881266-e72e-4a74-a232-2dc3c6e95f08","Type":"ContainerStarted","Data":"c4a8c0c135f0c22c3c3d701e20c21309460b6bc3206ee2cbe413b767f0b661fe"} Mar 09 02:45:32 crc kubenswrapper[4901]: I0309 02:45:32.210984 4901 generic.go:334] "Generic (PLEG): container 
finished" podID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerID="d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a" exitCode=0 Mar 09 02:45:32 crc kubenswrapper[4901]: I0309 02:45:32.211136 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrzzj" event={"ID":"980a688d-18b9-4f90-9255-a55568e7bbc0","Type":"ContainerDied","Data":"d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a"} Mar 09 02:45:32 crc kubenswrapper[4901]: I0309 02:45:32.213504 4901 generic.go:334] "Generic (PLEG): container finished" podID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerID="c4a8c0c135f0c22c3c3d701e20c21309460b6bc3206ee2cbe413b767f0b661fe" exitCode=0 Mar 09 02:45:32 crc kubenswrapper[4901]: I0309 02:45:32.213553 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9pkg" event={"ID":"1f881266-e72e-4a74-a232-2dc3c6e95f08","Type":"ContainerDied","Data":"c4a8c0c135f0c22c3c3d701e20c21309460b6bc3206ee2cbe413b767f0b661fe"} Mar 09 02:45:36 crc kubenswrapper[4901]: I0309 02:45:36.566883 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:45:36 crc kubenswrapper[4901]: I0309 02:45:36.567157 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:45:36 crc kubenswrapper[4901]: I0309 02:45:36.625767 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:45:37 crc kubenswrapper[4901]: I0309 02:45:37.245296 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrzzj" event={"ID":"980a688d-18b9-4f90-9255-a55568e7bbc0","Type":"ContainerStarted","Data":"bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4"} Mar 09 02:45:37 crc kubenswrapper[4901]: I0309 02:45:37.248599 4901 
generic.go:334] "Generic (PLEG): container finished" podID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerID="1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75" exitCode=0 Mar 09 02:45:37 crc kubenswrapper[4901]: I0309 02:45:37.248646 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgv99" event={"ID":"31ec7346-95de-49f9-ad63-7a7423ad1cc3","Type":"ContainerDied","Data":"1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75"} Mar 09 02:45:37 crc kubenswrapper[4901]: I0309 02:45:37.254112 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9pkg" event={"ID":"1f881266-e72e-4a74-a232-2dc3c6e95f08","Type":"ContainerStarted","Data":"eff38df816f5351f23333eb558e294ced397c67c3c364cc487c31e0d930ab1a7"} Mar 09 02:45:37 crc kubenswrapper[4901]: I0309 02:45:37.255917 4901 generic.go:334] "Generic (PLEG): container finished" podID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerID="41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211" exitCode=0 Mar 09 02:45:37 crc kubenswrapper[4901]: I0309 02:45:37.255970 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7kz2" event={"ID":"04b4583f-8f26-47e0-8726-a0c2f2dca07e","Type":"ContainerDied","Data":"41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211"} Mar 09 02:45:37 crc kubenswrapper[4901]: I0309 02:45:37.259353 4901 generic.go:334] "Generic (PLEG): container finished" podID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerID="ca2c6c23aed288577f4d7e1e61eda7f536c85a75458cbd1c4d8dcaf8737d7d1f" exitCode=0 Mar 09 02:45:37 crc kubenswrapper[4901]: I0309 02:45:37.259432 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7zs" event={"ID":"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20","Type":"ContainerDied","Data":"ca2c6c23aed288577f4d7e1e61eda7f536c85a75458cbd1c4d8dcaf8737d7d1f"} Mar 09 02:45:37 
crc kubenswrapper[4901]: I0309 02:45:37.283307 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rrzzj" podStartSLOduration=3.22628665 podStartE2EDuration="48.283289852s" podCreationTimestamp="2026-03-09 02:44:49 +0000 UTC" firstStartedPulling="2026-03-09 02:44:50.783087703 +0000 UTC m=+215.372751425" lastFinishedPulling="2026-03-09 02:45:35.840090855 +0000 UTC m=+260.429754627" observedRunningTime="2026-03-09 02:45:37.280440737 +0000 UTC m=+261.870104509" watchObservedRunningTime="2026-03-09 02:45:37.283289852 +0000 UTC m=+261.872953604" Mar 09 02:45:37 crc kubenswrapper[4901]: I0309 02:45:37.307114 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m9pkg" podStartSLOduration=3.995448904 podStartE2EDuration="49.307088703s" podCreationTimestamp="2026-03-09 02:44:48 +0000 UTC" firstStartedPulling="2026-03-09 02:44:50.772173197 +0000 UTC m=+215.361836929" lastFinishedPulling="2026-03-09 02:45:36.083812956 +0000 UTC m=+260.673476728" observedRunningTime="2026-03-09 02:45:37.303037075 +0000 UTC m=+261.892700837" watchObservedRunningTime="2026-03-09 02:45:37.307088703 +0000 UTC m=+261.896752465" Mar 09 02:45:37 crc kubenswrapper[4901]: I0309 02:45:37.347274 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.181617 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.181945 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.225886 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lvtrc" 
Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.267995 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgv99" event={"ID":"31ec7346-95de-49f9-ad63-7a7423ad1cc3","Type":"ContainerStarted","Data":"ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9"} Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.270581 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7kz2" event={"ID":"04b4583f-8f26-47e0-8726-a0c2f2dca07e","Type":"ContainerStarted","Data":"3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc"} Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.273620 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7zs" event={"ID":"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20","Type":"ContainerStarted","Data":"76f23be939376cddc439ea4d4d9116a4e5a11887f55232a53bfca8ed00f21d90"} Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.292740 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pgv99" podStartSLOduration=2.191526249 podStartE2EDuration="52.292724458s" podCreationTimestamp="2026-03-09 02:44:46 +0000 UTC" firstStartedPulling="2026-03-09 02:44:47.60049585 +0000 UTC m=+212.190159582" lastFinishedPulling="2026-03-09 02:45:37.701694059 +0000 UTC m=+262.291357791" observedRunningTime="2026-03-09 02:45:38.288663521 +0000 UTC m=+262.878327263" watchObservedRunningTime="2026-03-09 02:45:38.292724458 +0000 UTC m=+262.882388200" Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.333194 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8c7zs" podStartSLOduration=3.295393441 podStartE2EDuration="53.33317297s" podCreationTimestamp="2026-03-09 02:44:45 +0000 UTC" firstStartedPulling="2026-03-09 02:44:47.65448225 +0000 UTC m=+212.244145972" 
lastFinishedPulling="2026-03-09 02:45:37.692261769 +0000 UTC m=+262.281925501" observedRunningTime="2026-03-09 02:45:38.308899387 +0000 UTC m=+262.898563129" watchObservedRunningTime="2026-03-09 02:45:38.33317297 +0000 UTC m=+262.922836712" Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.342916 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.362827 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f7kz2" podStartSLOduration=3.219043819 podStartE2EDuration="53.362797645s" podCreationTimestamp="2026-03-09 02:44:45 +0000 UTC" firstStartedPulling="2026-03-09 02:44:47.648963505 +0000 UTC m=+212.238627237" lastFinishedPulling="2026-03-09 02:45:37.792717321 +0000 UTC m=+262.382381063" observedRunningTime="2026-03-09 02:45:38.336733325 +0000 UTC m=+262.926397067" watchObservedRunningTime="2026-03-09 02:45:38.362797645 +0000 UTC m=+262.952461387" Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.618499 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:45:38 crc kubenswrapper[4901]: I0309 02:45:38.666623 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:45:39 crc kubenswrapper[4901]: I0309 02:45:39.157371 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:45:39 crc kubenswrapper[4901]: I0309 02:45:39.157421 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:45:39 crc kubenswrapper[4901]: I0309 02:45:39.561794 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-llngg"] Mar 09 
02:45:39 crc kubenswrapper[4901]: I0309 02:45:39.562056 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-llngg" podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerName="registry-server" containerID="cri-o://cefbfc61f3b1f2cb4e7068df7a3a2f26eba4651a63f3a3a9c965132c8d0f984c" gracePeriod=2 Mar 09 02:45:39 crc kubenswrapper[4901]: I0309 02:45:39.594470 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:45:39 crc kubenswrapper[4901]: I0309 02:45:39.594726 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.198668 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m9pkg" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerName="registry-server" probeResult="failure" output=< Mar 09 02:45:40 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 02:45:40 crc kubenswrapper[4901]: > Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.286432 4901 generic.go:334] "Generic (PLEG): container finished" podID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerID="cefbfc61f3b1f2cb4e7068df7a3a2f26eba4651a63f3a3a9c965132c8d0f984c" exitCode=0 Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.286525 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llngg" event={"ID":"f9d5529e-413e-4c88-98d8-1df5a9e55721","Type":"ContainerDied","Data":"cefbfc61f3b1f2cb4e7068df7a3a2f26eba4651a63f3a3a9c965132c8d0f984c"} Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.649902 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rrzzj" podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerName="registry-server" probeResult="failure" 
output=< Mar 09 02:45:40 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 02:45:40 crc kubenswrapper[4901]: > Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.742269 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.797654 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-catalog-content\") pod \"f9d5529e-413e-4c88-98d8-1df5a9e55721\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.797802 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cctv\" (UniqueName: \"kubernetes.io/projected/f9d5529e-413e-4c88-98d8-1df5a9e55721-kube-api-access-2cctv\") pod \"f9d5529e-413e-4c88-98d8-1df5a9e55721\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.797843 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-utilities\") pod \"f9d5529e-413e-4c88-98d8-1df5a9e55721\" (UID: \"f9d5529e-413e-4c88-98d8-1df5a9e55721\") " Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.798848 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-utilities" (OuterVolumeSpecName: "utilities") pod "f9d5529e-413e-4c88-98d8-1df5a9e55721" (UID: "f9d5529e-413e-4c88-98d8-1df5a9e55721"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.804350 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9d5529e-413e-4c88-98d8-1df5a9e55721-kube-api-access-2cctv" (OuterVolumeSpecName: "kube-api-access-2cctv") pod "f9d5529e-413e-4c88-98d8-1df5a9e55721" (UID: "f9d5529e-413e-4c88-98d8-1df5a9e55721"). InnerVolumeSpecName "kube-api-access-2cctv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.850694 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9d5529e-413e-4c88-98d8-1df5a9e55721" (UID: "f9d5529e-413e-4c88-98d8-1df5a9e55721"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.900073 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cctv\" (UniqueName: \"kubernetes.io/projected/f9d5529e-413e-4c88-98d8-1df5a9e55721-kube-api-access-2cctv\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.900152 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:40 crc kubenswrapper[4901]: I0309 02:45:40.900179 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9d5529e-413e-4c88-98d8-1df5a9e55721-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:41 crc kubenswrapper[4901]: I0309 02:45:41.296061 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llngg" 
event={"ID":"f9d5529e-413e-4c88-98d8-1df5a9e55721","Type":"ContainerDied","Data":"1414ec18dd610b4e3f03f5b2694707803f86255512f28660430c6d684f0bbf0b"} Mar 09 02:45:41 crc kubenswrapper[4901]: I0309 02:45:41.296614 4901 scope.go:117] "RemoveContainer" containerID="cefbfc61f3b1f2cb4e7068df7a3a2f26eba4651a63f3a3a9c965132c8d0f984c" Mar 09 02:45:41 crc kubenswrapper[4901]: I0309 02:45:41.296129 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-llngg" Mar 09 02:45:41 crc kubenswrapper[4901]: I0309 02:45:41.318648 4901 scope.go:117] "RemoveContainer" containerID="c43c00179f1b30f4609b2c2dd9b5045ee6f1cab9bd2b4e5a78066fd47e238a64" Mar 09 02:45:41 crc kubenswrapper[4901]: I0309 02:45:41.337025 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-llngg"] Mar 09 02:45:41 crc kubenswrapper[4901]: I0309 02:45:41.349473 4901 scope.go:117] "RemoveContainer" containerID="368e95487a8e31ef31a5fc871c10b02ebed4b56116c83e7486b7b9a266d1bfff" Mar 09 02:45:41 crc kubenswrapper[4901]: I0309 02:45:41.357640 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-llngg"] Mar 09 02:45:41 crc kubenswrapper[4901]: I0309 02:45:41.968099 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6kj"] Mar 09 02:45:41 crc kubenswrapper[4901]: I0309 02:45:41.968570 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jp6kj" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerName="registry-server" containerID="cri-o://3375ff6f15a7dd0ad28fa714ada39815827c3e7b828fd77c76a4ade566c07e27" gracePeriod=2 Mar 09 02:45:42 crc kubenswrapper[4901]: I0309 02:45:42.118845 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" 
path="/var/lib/kubelet/pods/f9d5529e-413e-4c88-98d8-1df5a9e55721/volumes" Mar 09 02:45:42 crc kubenswrapper[4901]: I0309 02:45:42.304334 4901 generic.go:334] "Generic (PLEG): container finished" podID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerID="3375ff6f15a7dd0ad28fa714ada39815827c3e7b828fd77c76a4ade566c07e27" exitCode=0 Mar 09 02:45:42 crc kubenswrapper[4901]: I0309 02:45:42.304413 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6kj" event={"ID":"da9622a3-9e9b-4c8a-86d3-110f44883bbc","Type":"ContainerDied","Data":"3375ff6f15a7dd0ad28fa714ada39815827c3e7b828fd77c76a4ade566c07e27"} Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.120841 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.232037 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-catalog-content\") pod \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.232134 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98gpc\" (UniqueName: \"kubernetes.io/projected/da9622a3-9e9b-4c8a-86d3-110f44883bbc-kube-api-access-98gpc\") pod \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.232272 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-utilities\") pod \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\" (UID: \"da9622a3-9e9b-4c8a-86d3-110f44883bbc\") " Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.232923 4901 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-utilities" (OuterVolumeSpecName: "utilities") pod "da9622a3-9e9b-4c8a-86d3-110f44883bbc" (UID: "da9622a3-9e9b-4c8a-86d3-110f44883bbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.244506 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9622a3-9e9b-4c8a-86d3-110f44883bbc-kube-api-access-98gpc" (OuterVolumeSpecName: "kube-api-access-98gpc") pod "da9622a3-9e9b-4c8a-86d3-110f44883bbc" (UID: "da9622a3-9e9b-4c8a-86d3-110f44883bbc"). InnerVolumeSpecName "kube-api-access-98gpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.278605 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da9622a3-9e9b-4c8a-86d3-110f44883bbc" (UID: "da9622a3-9e9b-4c8a-86d3-110f44883bbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.326893 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6kj" event={"ID":"da9622a3-9e9b-4c8a-86d3-110f44883bbc","Type":"ContainerDied","Data":"c0dbd2e364a9d9f9191768b79e02f943d3dfb0ed0e35922b763df5adbba2e901"} Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.326970 4901 scope.go:117] "RemoveContainer" containerID="3375ff6f15a7dd0ad28fa714ada39815827c3e7b828fd77c76a4ade566c07e27" Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.327162 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp6kj" Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.333486 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.333531 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da9622a3-9e9b-4c8a-86d3-110f44883bbc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.333554 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98gpc\" (UniqueName: \"kubernetes.io/projected/da9622a3-9e9b-4c8a-86d3-110f44883bbc-kube-api-access-98gpc\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.355442 4901 scope.go:117] "RemoveContainer" containerID="1feee7a0be93d3f596c3f82ac98106bbdfad7a29435505a3976377cb4f90117f" Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.373319 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6kj"] Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.380622 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6kj"] Mar 09 02:45:43 crc kubenswrapper[4901]: I0309 02:45:43.387311 4901 scope.go:117] "RemoveContainer" containerID="bc517bbf89051779d2c9961b2dbb1cb2009e6191eb89d6f198ecf7896781c952" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.119091 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" path="/var/lib/kubelet/pods/da9622a3-9e9b-4c8a-86d3-110f44883bbc/volumes" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.242021 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-77db75d486-lj5ct"] Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.242381 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" podUID="5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" containerName="controller-manager" containerID="cri-o://d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f" gracePeriod=30 Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.261015 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd"] Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.261994 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" podUID="569f3f72-275f-43c8-b9ea-258fff4d7af5" containerName="route-controller-manager" containerID="cri-o://75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a" gracePeriod=30 Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.838817 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.911095 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.974467 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/569f3f72-275f-43c8-b9ea-258fff4d7af5-serving-cert\") pod \"569f3f72-275f-43c8-b9ea-258fff4d7af5\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.975084 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwtdd\" (UniqueName: \"kubernetes.io/projected/569f3f72-275f-43c8-b9ea-258fff4d7af5-kube-api-access-hwtdd\") pod \"569f3f72-275f-43c8-b9ea-258fff4d7af5\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.975272 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56sgf\" (UniqueName: \"kubernetes.io/projected/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-kube-api-access-56sgf\") pod \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.975592 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-client-ca\") pod \"569f3f72-275f-43c8-b9ea-258fff4d7af5\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.975624 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-config\") pod \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.975946 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-serving-cert\") pod \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.975976 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-proxy-ca-bundles\") pod \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.976212 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-config\") pod \"569f3f72-275f-43c8-b9ea-258fff4d7af5\" (UID: \"569f3f72-275f-43c8-b9ea-258fff4d7af5\") " Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.976288 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-client-ca\") pod \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\" (UID: \"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa\") " Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.976348 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-client-ca" (OuterVolumeSpecName: "client-ca") pod "569f3f72-275f-43c8-b9ea-258fff4d7af5" (UID: "569f3f72-275f-43c8-b9ea-258fff4d7af5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.976519 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-config" (OuterVolumeSpecName: "config") pod "5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" (UID: "5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.976711 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.976730 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.976894 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" (UID: "5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.977036 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-config" (OuterVolumeSpecName: "config") pod "569f3f72-275f-43c8-b9ea-258fff4d7af5" (UID: "569f3f72-275f-43c8-b9ea-258fff4d7af5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.977443 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-client-ca" (OuterVolumeSpecName: "client-ca") pod "5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" (UID: "5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.980167 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" (UID: "5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.980767 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/569f3f72-275f-43c8-b9ea-258fff4d7af5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "569f3f72-275f-43c8-b9ea-258fff4d7af5" (UID: "569f3f72-275f-43c8-b9ea-258fff4d7af5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.980885 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-kube-api-access-56sgf" (OuterVolumeSpecName: "kube-api-access-56sgf") pod "5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" (UID: "5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa"). InnerVolumeSpecName "kube-api-access-56sgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:44 crc kubenswrapper[4901]: I0309 02:45:44.980944 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/569f3f72-275f-43c8-b9ea-258fff4d7af5-kube-api-access-hwtdd" (OuterVolumeSpecName: "kube-api-access-hwtdd") pod "569f3f72-275f-43c8-b9ea-258fff4d7af5" (UID: "569f3f72-275f-43c8-b9ea-258fff4d7af5"). InnerVolumeSpecName "kube-api-access-hwtdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.078259 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.078316 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.078336 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f3f72-275f-43c8-b9ea-258fff4d7af5-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.078353 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.078371 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/569f3f72-275f-43c8-b9ea-258fff4d7af5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.078389 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwtdd\" (UniqueName: 
\"kubernetes.io/projected/569f3f72-275f-43c8-b9ea-258fff4d7af5-kube-api-access-hwtdd\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.078407 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56sgf\" (UniqueName: \"kubernetes.io/projected/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa-kube-api-access-56sgf\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.345868 4901 generic.go:334] "Generic (PLEG): container finished" podID="569f3f72-275f-43c8-b9ea-258fff4d7af5" containerID="75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a" exitCode=0 Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.345942 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.346056 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" event={"ID":"569f3f72-275f-43c8-b9ea-258fff4d7af5","Type":"ContainerDied","Data":"75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a"} Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.346127 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd" event={"ID":"569f3f72-275f-43c8-b9ea-258fff4d7af5","Type":"ContainerDied","Data":"c98f78d91585d4f08a96777d3290763dc46f6a4c47710662e3bf6e6cb11a8636"} Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.346167 4901 scope.go:117] "RemoveContainer" containerID="75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.355287 4901 generic.go:334] "Generic (PLEG): container finished" podID="5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" 
containerID="d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f" exitCode=0 Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.355356 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" event={"ID":"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa","Type":"ContainerDied","Data":"d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f"} Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.355405 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" event={"ID":"5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa","Type":"ContainerDied","Data":"efa6a441234d44a6ac4a828f6a7d9ccb6e6b8dce2c0364017657c382d20805d7"} Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.355494 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77db75d486-lj5ct" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.375299 4901 scope.go:117] "RemoveContainer" containerID="75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a" Mar 09 02:45:45 crc kubenswrapper[4901]: E0309 02:45:45.379888 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a\": container with ID starting with 75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a not found: ID does not exist" containerID="75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.379938 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a"} err="failed to get container status \"75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a\": rpc error: code = NotFound desc = 
could not find container \"75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a\": container with ID starting with 75ebb6ca60b7f9e2d4c5c249ccf9078a9a5e795489bf859ba9fd29a18119694a not found: ID does not exist" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.379968 4901 scope.go:117] "RemoveContainer" containerID="d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.406819 4901 scope.go:117] "RemoveContainer" containerID="d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f" Mar 09 02:45:45 crc kubenswrapper[4901]: E0309 02:45:45.407440 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f\": container with ID starting with d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f not found: ID does not exist" containerID="d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.407480 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f"} err="failed to get container status \"d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f\": rpc error: code = NotFound desc = could not find container \"d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f\": container with ID starting with d3f162d9f6c72c5683dde1fcbf44588a64246f3a1fe1c6526fcf7a86ed4bb97f not found: ID does not exist" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.420178 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd"] Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.427097 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-74cfc7b869-mhczd"] Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.432302 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77db75d486-lj5ct"] Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.436499 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77db75d486-lj5ct"] Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.578133 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g"] Mar 09 02:45:45 crc kubenswrapper[4901]: E0309 02:45:45.578611 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" containerName="controller-manager" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.578640 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" containerName="controller-manager" Mar 09 02:45:45 crc kubenswrapper[4901]: E0309 02:45:45.578658 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerName="extract-utilities" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.578670 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerName="extract-utilities" Mar 09 02:45:45 crc kubenswrapper[4901]: E0309 02:45:45.578695 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerName="extract-utilities" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.578710 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerName="extract-utilities" Mar 09 02:45:45 crc kubenswrapper[4901]: E0309 02:45:45.578724 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerName="extract-content" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.578737 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerName="extract-content" Mar 09 02:45:45 crc kubenswrapper[4901]: E0309 02:45:45.578750 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerName="registry-server" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.578762 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerName="registry-server" Mar 09 02:45:45 crc kubenswrapper[4901]: E0309 02:45:45.578786 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569f3f72-275f-43c8-b9ea-258fff4d7af5" containerName="route-controller-manager" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.578798 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="569f3f72-275f-43c8-b9ea-258fff4d7af5" containerName="route-controller-manager" Mar 09 02:45:45 crc kubenswrapper[4901]: E0309 02:45:45.578817 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerName="registry-server" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.578829 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerName="registry-server" Mar 09 02:45:45 crc kubenswrapper[4901]: E0309 02:45:45.578865 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerName="extract-content" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.578877 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerName="extract-content" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.579047 4901 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f9d5529e-413e-4c88-98d8-1df5a9e55721" containerName="registry-server" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.579063 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" containerName="controller-manager" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.579080 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9622a3-9e9b-4c8a-86d3-110f44883bbc" containerName="registry-server" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.579105 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="569f3f72-275f-43c8-b9ea-258fff4d7af5" containerName="route-controller-manager" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.579637 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b84c445f9-767xj"] Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.581633 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.582004 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.584535 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.584691 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.584890 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.584983 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.585716 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.587560 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.588973 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.589043 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.589219 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.589465 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 
02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.589873 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.592323 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.592398 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g"] Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.595189 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b84c445f9-767xj"] Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.602140 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.685258 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b849405e-49b5-468b-bfd1-1b305aca9529-serving-cert\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.685310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-config\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.685349 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5g6p\" 
(UniqueName: \"kubernetes.io/projected/b849405e-49b5-468b-bfd1-1b305aca9529-kube-api-access-h5g6p\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.685394 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91541d62-9618-47c1-8e73-935764f24375-serving-cert\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.685493 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxfkr\" (UniqueName: \"kubernetes.io/projected/91541d62-9618-47c1-8e73-935764f24375-kube-api-access-xxfkr\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.685567 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-config\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.685617 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-client-ca\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " 
pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.685640 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-client-ca\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.685663 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-proxy-ca-bundles\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.786943 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxfkr\" (UniqueName: \"kubernetes.io/projected/91541d62-9618-47c1-8e73-935764f24375-kube-api-access-xxfkr\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.787465 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-config\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.789420 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-config\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.789527 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-client-ca\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.790543 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-client-ca\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.790639 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-client-ca\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.792426 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-client-ca\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.792501 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-proxy-ca-bundles\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.792669 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b849405e-49b5-468b-bfd1-1b305aca9529-serving-cert\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.792757 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-proxy-ca-bundles\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.793365 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-config\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.793468 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5g6p\" (UniqueName: \"kubernetes.io/projected/b849405e-49b5-468b-bfd1-1b305aca9529-kube-api-access-h5g6p\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " 
pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.793582 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91541d62-9618-47c1-8e73-935764f24375-serving-cert\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.795464 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-config\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.797302 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b849405e-49b5-468b-bfd1-1b305aca9529-serving-cert\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.799045 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91541d62-9618-47c1-8e73-935764f24375-serving-cert\") pod \"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.813050 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxfkr\" (UniqueName: \"kubernetes.io/projected/91541d62-9618-47c1-8e73-935764f24375-kube-api-access-xxfkr\") pod 
\"controller-manager-7b84c445f9-767xj\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.827631 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5g6p\" (UniqueName: \"kubernetes.io/projected/b849405e-49b5-468b-bfd1-1b305aca9529-kube-api-access-h5g6p\") pod \"route-controller-manager-ccd8f799d-cmm5g\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.921539 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.928031 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.950504 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.950537 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:45:45 crc kubenswrapper[4901]: I0309 02:45:45.999463 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.114463 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="569f3f72-275f-43c8-b9ea-258fff4d7af5" path="/var/lib/kubelet/pods/569f3f72-275f-43c8-b9ea-258fff4d7af5/volumes" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.115595 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa" path="/var/lib/kubelet/pods/5dc62bb1-5c7f-404c-ba83-f4a7fde3f1aa/volumes" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.162894 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b84c445f9-767xj"] Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.178378 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.178420 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.220653 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.365656 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" event={"ID":"91541d62-9618-47c1-8e73-935764f24375","Type":"ContainerStarted","Data":"183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f"} Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.365974 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" event={"ID":"91541d62-9618-47c1-8e73-935764f24375","Type":"ContainerStarted","Data":"ab4db9c251c1979b5815efe33de3175a63afed4d9e2c007250bbe0305692a7cf"} Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.365994 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.370845 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:45:46 crc kubenswrapper[4901]: 
I0309 02:45:46.370875 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.397918 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.411197 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" podStartSLOduration=2.411180874 podStartE2EDuration="2.411180874s" podCreationTimestamp="2026-03-09 02:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:45:46.393258929 +0000 UTC m=+270.982922661" watchObservedRunningTime="2026-03-09 02:45:46.411180874 +0000 UTC m=+271.000844616" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.416079 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g"] Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.425082 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.438710 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:45:46 crc kubenswrapper[4901]: I0309 02:45:46.439753 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:45:47 crc kubenswrapper[4901]: I0309 02:45:47.378151 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" 
event={"ID":"b849405e-49b5-468b-bfd1-1b305aca9529","Type":"ContainerStarted","Data":"b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803"} Mar 09 02:45:47 crc kubenswrapper[4901]: I0309 02:45:47.378293 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" event={"ID":"b849405e-49b5-468b-bfd1-1b305aca9529","Type":"ContainerStarted","Data":"51716f54fc67d16127a9475e2bfcaab93d23ba104cae3671f5a0a5a4a0ca4cd5"} Mar 09 02:45:47 crc kubenswrapper[4901]: I0309 02:45:47.405520 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" podStartSLOduration=3.405487131 podStartE2EDuration="3.405487131s" podCreationTimestamp="2026-03-09 02:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:45:47.404672739 +0000 UTC m=+271.994336471" watchObservedRunningTime="2026-03-09 02:45:47.405487131 +0000 UTC m=+271.995150863" Mar 09 02:45:47 crc kubenswrapper[4901]: I0309 02:45:47.428919 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:45:48 crc kubenswrapper[4901]: I0309 02:45:48.223647 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tzf54"] Mar 09 02:45:48 crc kubenswrapper[4901]: I0309 02:45:48.382899 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:48 crc kubenswrapper[4901]: I0309 02:45:48.388685 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.164759 4901 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pgv99"] Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.226719 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.284374 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.388734 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pgv99" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerName="registry-server" containerID="cri-o://ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9" gracePeriod=2 Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.653300 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.699448 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.867796 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.977760 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj7ml\" (UniqueName: \"kubernetes.io/projected/31ec7346-95de-49f9-ad63-7a7423ad1cc3-kube-api-access-mj7ml\") pod \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.977873 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-catalog-content\") pod \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.977984 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-utilities\") pod \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\" (UID: \"31ec7346-95de-49f9-ad63-7a7423ad1cc3\") " Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.979505 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-utilities" (OuterVolumeSpecName: "utilities") pod "31ec7346-95de-49f9-ad63-7a7423ad1cc3" (UID: "31ec7346-95de-49f9-ad63-7a7423ad1cc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:45:49 crc kubenswrapper[4901]: I0309 02:45:49.985039 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ec7346-95de-49f9-ad63-7a7423ad1cc3-kube-api-access-mj7ml" (OuterVolumeSpecName: "kube-api-access-mj7ml") pod "31ec7346-95de-49f9-ad63-7a7423ad1cc3" (UID: "31ec7346-95de-49f9-ad63-7a7423ad1cc3"). InnerVolumeSpecName "kube-api-access-mj7ml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.028437 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31ec7346-95de-49f9-ad63-7a7423ad1cc3" (UID: "31ec7346-95de-49f9-ad63-7a7423ad1cc3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.079256 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj7ml\" (UniqueName: \"kubernetes.io/projected/31ec7346-95de-49f9-ad63-7a7423ad1cc3-kube-api-access-mj7ml\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.079488 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.079499 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ec7346-95de-49f9-ad63-7a7423ad1cc3-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.408783 4901 generic.go:334] "Generic (PLEG): container finished" podID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerID="ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9" exitCode=0 Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.408836 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgv99" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.408884 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgv99" event={"ID":"31ec7346-95de-49f9-ad63-7a7423ad1cc3","Type":"ContainerDied","Data":"ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9"} Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.408971 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgv99" event={"ID":"31ec7346-95de-49f9-ad63-7a7423ad1cc3","Type":"ContainerDied","Data":"3e9c01d36d5ee1ced6522878dfb79b08d16bed6c0a4e805bdd30c0f9dc1f4148"} Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.409012 4901 scope.go:117] "RemoveContainer" containerID="ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.435743 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pgv99"] Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.439323 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pgv99"] Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.453164 4901 scope.go:117] "RemoveContainer" containerID="1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.477643 4901 scope.go:117] "RemoveContainer" containerID="8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.515545 4901 scope.go:117] "RemoveContainer" containerID="ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9" Mar 09 02:45:50 crc kubenswrapper[4901]: E0309 02:45:50.516310 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9\": container with ID starting with ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9 not found: ID does not exist" containerID="ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.516381 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9"} err="failed to get container status \"ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9\": rpc error: code = NotFound desc = could not find container \"ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9\": container with ID starting with ef0861ada52f9a028b4c821f8d51edb074de5475b60241dad3577de3c23590a9 not found: ID does not exist" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.516421 4901 scope.go:117] "RemoveContainer" containerID="1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75" Mar 09 02:45:50 crc kubenswrapper[4901]: E0309 02:45:50.516873 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75\": container with ID starting with 1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75 not found: ID does not exist" containerID="1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.516912 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75"} err="failed to get container status \"1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75\": rpc error: code = NotFound desc = could not find container \"1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75\": container with ID 
starting with 1f6b15ab85e20e499dc0c57a75d0cb1adc0cdb16206f32ca3baf6c0a3a279e75 not found: ID does not exist" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.516935 4901 scope.go:117] "RemoveContainer" containerID="8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9" Mar 09 02:45:50 crc kubenswrapper[4901]: E0309 02:45:50.517543 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9\": container with ID starting with 8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9 not found: ID does not exist" containerID="8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9" Mar 09 02:45:50 crc kubenswrapper[4901]: I0309 02:45:50.517584 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9"} err="failed to get container status \"8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9\": rpc error: code = NotFound desc = could not find container \"8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9\": container with ID starting with 8bdb7754b1766bab89302c80d1535d35a1431877626b1603537636c60f6cf1d9 not found: ID does not exist" Mar 09 02:45:52 crc kubenswrapper[4901]: I0309 02:45:52.118330 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" path="/var/lib/kubelet/pods/31ec7346-95de-49f9-ad63-7a7423ad1cc3/volumes" Mar 09 02:45:52 crc kubenswrapper[4901]: I0309 02:45:52.562637 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rrzzj"] Mar 09 02:45:52 crc kubenswrapper[4901]: I0309 02:45:52.563068 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rrzzj" 
podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerName="registry-server" containerID="cri-o://bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4" gracePeriod=2 Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.040166 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.131844 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-utilities\") pod \"980a688d-18b9-4f90-9255-a55568e7bbc0\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.131885 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-catalog-content\") pod \"980a688d-18b9-4f90-9255-a55568e7bbc0\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.131946 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ckrx\" (UniqueName: \"kubernetes.io/projected/980a688d-18b9-4f90-9255-a55568e7bbc0-kube-api-access-4ckrx\") pod \"980a688d-18b9-4f90-9255-a55568e7bbc0\" (UID: \"980a688d-18b9-4f90-9255-a55568e7bbc0\") " Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.133471 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-utilities" (OuterVolumeSpecName: "utilities") pod "980a688d-18b9-4f90-9255-a55568e7bbc0" (UID: "980a688d-18b9-4f90-9255-a55568e7bbc0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.147526 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/980a688d-18b9-4f90-9255-a55568e7bbc0-kube-api-access-4ckrx" (OuterVolumeSpecName: "kube-api-access-4ckrx") pod "980a688d-18b9-4f90-9255-a55568e7bbc0" (UID: "980a688d-18b9-4f90-9255-a55568e7bbc0"). InnerVolumeSpecName "kube-api-access-4ckrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.233819 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.234072 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ckrx\" (UniqueName: \"kubernetes.io/projected/980a688d-18b9-4f90-9255-a55568e7bbc0-kube-api-access-4ckrx\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.248282 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "980a688d-18b9-4f90-9255-a55568e7bbc0" (UID: "980a688d-18b9-4f90-9255-a55568e7bbc0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.335398 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/980a688d-18b9-4f90-9255-a55568e7bbc0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.433581 4901 generic.go:334] "Generic (PLEG): container finished" podID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerID="bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4" exitCode=0 Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.433657 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrzzj" event={"ID":"980a688d-18b9-4f90-9255-a55568e7bbc0","Type":"ContainerDied","Data":"bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4"} Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.433719 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrzzj" event={"ID":"980a688d-18b9-4f90-9255-a55568e7bbc0","Type":"ContainerDied","Data":"eebca66ba4df17581a2b391b60c15d20e345580a7efefc3b3b72adca8c654761"} Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.433763 4901 scope.go:117] "RemoveContainer" containerID="bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.434345 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rrzzj" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.450163 4901 scope.go:117] "RemoveContainer" containerID="d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.468167 4901 scope.go:117] "RemoveContainer" containerID="221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.507437 4901 scope.go:117] "RemoveContainer" containerID="bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4" Mar 09 02:45:53 crc kubenswrapper[4901]: E0309 02:45:53.512677 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4\": container with ID starting with bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4 not found: ID does not exist" containerID="bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.512733 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4"} err="failed to get container status \"bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4\": rpc error: code = NotFound desc = could not find container \"bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4\": container with ID starting with bc51639b1fdaab2b9298abfddc4c154b2c35b9379b248bccf51c2f0c715732c4 not found: ID does not exist" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.512763 4901 scope.go:117] "RemoveContainer" containerID="d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.514343 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-rrzzj"] Mar 09 02:45:53 crc kubenswrapper[4901]: E0309 02:45:53.514524 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a\": container with ID starting with d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a not found: ID does not exist" containerID="d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.514551 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a"} err="failed to get container status \"d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a\": rpc error: code = NotFound desc = could not find container \"d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a\": container with ID starting with d8249cf85e541b0ff2e4327887c74d43f290628701991b302986bf860d7a9b9a not found: ID does not exist" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.514574 4901 scope.go:117] "RemoveContainer" containerID="221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9" Mar 09 02:45:53 crc kubenswrapper[4901]: E0309 02:45:53.520953 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9\": container with ID starting with 221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9 not found: ID does not exist" containerID="221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.520992 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9"} err="failed 
to get container status \"221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9\": rpc error: code = NotFound desc = could not find container \"221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9\": container with ID starting with 221aebb0e1b3bf7eb222157cf442fc7acd39bda43a80d52539eca60faf5a6ba9 not found: ID does not exist" Mar 09 02:45:53 crc kubenswrapper[4901]: I0309 02:45:53.521743 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rrzzj"] Mar 09 02:45:54 crc kubenswrapper[4901]: I0309 02:45:54.111787 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" path="/var/lib/kubelet/pods/980a688d-18b9-4f90-9255-a55568e7bbc0/volumes" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.137735 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550406-smrn9"] Mar 09 02:46:00 crc kubenswrapper[4901]: E0309 02:46:00.138250 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerName="extract-utilities" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.138265 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerName="extract-utilities" Mar 09 02:46:00 crc kubenswrapper[4901]: E0309 02:46:00.138283 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerName="registry-server" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.138292 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerName="registry-server" Mar 09 02:46:00 crc kubenswrapper[4901]: E0309 02:46:00.138307 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerName="extract-utilities" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 
02:46:00.138315 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerName="extract-utilities" Mar 09 02:46:00 crc kubenswrapper[4901]: E0309 02:46:00.138327 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerName="extract-content" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.138336 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerName="extract-content" Mar 09 02:46:00 crc kubenswrapper[4901]: E0309 02:46:00.138350 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerName="extract-content" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.138358 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerName="extract-content" Mar 09 02:46:00 crc kubenswrapper[4901]: E0309 02:46:00.138372 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerName="registry-server" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.138380 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerName="registry-server" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.138493 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="31ec7346-95de-49f9-ad63-7a7423ad1cc3" containerName="registry-server" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.138506 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="980a688d-18b9-4f90-9255-a55568e7bbc0" containerName="registry-server" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.138936 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550406-smrn9" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.142051 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.142858 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.143532 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.149616 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550406-smrn9"] Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.242153 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6ss2\" (UniqueName: \"kubernetes.io/projected/9a751411-2ccc-4bae-bdc6-c34fe2385db3-kube-api-access-x6ss2\") pod \"auto-csr-approver-29550406-smrn9\" (UID: \"9a751411-2ccc-4bae-bdc6-c34fe2385db3\") " pod="openshift-infra/auto-csr-approver-29550406-smrn9" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.344006 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6ss2\" (UniqueName: \"kubernetes.io/projected/9a751411-2ccc-4bae-bdc6-c34fe2385db3-kube-api-access-x6ss2\") pod \"auto-csr-approver-29550406-smrn9\" (UID: \"9a751411-2ccc-4bae-bdc6-c34fe2385db3\") " pod="openshift-infra/auto-csr-approver-29550406-smrn9" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.369985 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6ss2\" (UniqueName: \"kubernetes.io/projected/9a751411-2ccc-4bae-bdc6-c34fe2385db3-kube-api-access-x6ss2\") pod \"auto-csr-approver-29550406-smrn9\" (UID: \"9a751411-2ccc-4bae-bdc6-c34fe2385db3\") " 
pod="openshift-infra/auto-csr-approver-29550406-smrn9" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.461344 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550406-smrn9" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.862859 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.862937 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:46:00 crc kubenswrapper[4901]: I0309 02:46:00.896852 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550406-smrn9"] Mar 09 02:46:00 crc kubenswrapper[4901]: W0309 02:46:00.907435 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a751411_2ccc_4bae_bdc6_c34fe2385db3.slice/crio-1306e491bd1aad0c746f0a7541a3b2e1f34df503de637899fc5ce2eacd048db8 WatchSource:0}: Error finding container 1306e491bd1aad0c746f0a7541a3b2e1f34df503de637899fc5ce2eacd048db8: Status 404 returned error can't find the container with id 1306e491bd1aad0c746f0a7541a3b2e1f34df503de637899fc5ce2eacd048db8 Mar 09 02:46:01 crc kubenswrapper[4901]: I0309 02:46:01.485809 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550406-smrn9" 
event={"ID":"9a751411-2ccc-4bae-bdc6-c34fe2385db3","Type":"ContainerStarted","Data":"1306e491bd1aad0c746f0a7541a3b2e1f34df503de637899fc5ce2eacd048db8"} Mar 09 02:46:02 crc kubenswrapper[4901]: I0309 02:46:02.493690 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550406-smrn9" event={"ID":"9a751411-2ccc-4bae-bdc6-c34fe2385db3","Type":"ContainerStarted","Data":"5172438ccca7ce5fed0dddfef9f317c9139e2cda2f00e28127f72de1def21d6f"} Mar 09 02:46:02 crc kubenswrapper[4901]: I0309 02:46:02.510823 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550406-smrn9" podStartSLOduration=1.3719845720000001 podStartE2EDuration="2.510772157s" podCreationTimestamp="2026-03-09 02:46:00 +0000 UTC" firstStartedPulling="2026-03-09 02:46:00.911147271 +0000 UTC m=+285.500811013" lastFinishedPulling="2026-03-09 02:46:02.049934826 +0000 UTC m=+286.639598598" observedRunningTime="2026-03-09 02:46:02.508681472 +0000 UTC m=+287.098345214" watchObservedRunningTime="2026-03-09 02:46:02.510772157 +0000 UTC m=+287.100435879" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.039561 4901 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.040201 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38" gracePeriod=15 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.040342 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" 
containerID="cri-o://f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf" gracePeriod=15 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.040341 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406" gracePeriod=15 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.040362 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552" gracePeriod=15 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.040579 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd" gracePeriod=15 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.041916 4901 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 02:46:03 crc kubenswrapper[4901]: E0309 02:46:03.042414 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.042461 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 02:46:03 crc kubenswrapper[4901]: E0309 02:46:03.042494 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.042511 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 02:46:03 crc kubenswrapper[4901]: E0309 02:46:03.042534 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.042550 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: E0309 02:46:03.042574 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.042589 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 02:46:03 crc kubenswrapper[4901]: E0309 02:46:03.042604 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.042620 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: E0309 02:46:03.042645 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.042662 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: E0309 02:46:03.042682 4901 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.042698 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: E0309 02:46:03.042718 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.042734 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 02:46:03 crc kubenswrapper[4901]: E0309 02:46:03.042760 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.042776 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.043008 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.043035 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.043051 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.043077 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.043097 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.043121 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.043140 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: E0309 02:46:03.043401 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.043423 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.043693 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.044062 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.045762 4901 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.046937 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.057858 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.195441 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.195504 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.195531 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.195559 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.195826 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.195965 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.196021 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.196076 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297292 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297357 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297387 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297415 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297484 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297511 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297532 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297551 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297643 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297688 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297715 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297740 
4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297766 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297792 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297820 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.297844 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.499848 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" containerID="5172438ccca7ce5fed0dddfef9f317c9139e2cda2f00e28127f72de1def21d6f" exitCode=0 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.499976 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550406-smrn9" event={"ID":"9a751411-2ccc-4bae-bdc6-c34fe2385db3","Type":"ContainerDied","Data":"5172438ccca7ce5fed0dddfef9f317c9139e2cda2f00e28127f72de1def21d6f"} Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.500601 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.501635 4901 generic.go:334] "Generic (PLEG): container finished" podID="946ddaae-0092-45f2-b5af-3a5168fa64e8" containerID="14d8eb66d461e65f942f8415e30e0dd40a4d9ece6ecd33c7d045290d53091edc" exitCode=0 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.501714 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"946ddaae-0092-45f2-b5af-3a5168fa64e8","Type":"ContainerDied","Data":"14d8eb66d461e65f942f8415e30e0dd40a4d9ece6ecd33c7d045290d53091edc"} Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.502180 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.502626 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.504197 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.505367 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.505928 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406" exitCode=0 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.505952 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf" exitCode=0 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.505962 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd" exitCode=0 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.505972 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552" exitCode=2 Mar 09 02:46:03 crc kubenswrapper[4901]: I0309 02:46:03.506016 4901 scope.go:117] "RemoveContainer" containerID="d4536e1ee725ee8ff6d9cd0fb06e1bbf9184ccf667480e5727a1ce065197f2cc" Mar 09 02:46:04 crc kubenswrapper[4901]: I0309 02:46:04.517734 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.165870 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.167831 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.168056 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.168377 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550406-smrn9" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.168793 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.170024 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.340619 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-var-lock\") pod \"946ddaae-0092-45f2-b5af-3a5168fa64e8\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.340668 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-kubelet-dir\") pod \"946ddaae-0092-45f2-b5af-3a5168fa64e8\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.340740 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-var-lock" (OuterVolumeSpecName: "var-lock") pod "946ddaae-0092-45f2-b5af-3a5168fa64e8" (UID: "946ddaae-0092-45f2-b5af-3a5168fa64e8"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.340802 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6ss2\" (UniqueName: \"kubernetes.io/projected/9a751411-2ccc-4bae-bdc6-c34fe2385db3-kube-api-access-x6ss2\") pod \"9a751411-2ccc-4bae-bdc6-c34fe2385db3\" (UID: \"9a751411-2ccc-4bae-bdc6-c34fe2385db3\") " Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.340849 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/946ddaae-0092-45f2-b5af-3a5168fa64e8-kube-api-access\") pod \"946ddaae-0092-45f2-b5af-3a5168fa64e8\" (UID: \"946ddaae-0092-45f2-b5af-3a5168fa64e8\") " Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.340843 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "946ddaae-0092-45f2-b5af-3a5168fa64e8" (UID: "946ddaae-0092-45f2-b5af-3a5168fa64e8"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.341116 4901 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.341134 4901 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/946ddaae-0092-45f2-b5af-3a5168fa64e8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.345746 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a751411-2ccc-4bae-bdc6-c34fe2385db3-kube-api-access-x6ss2" (OuterVolumeSpecName: "kube-api-access-x6ss2") pod "9a751411-2ccc-4bae-bdc6-c34fe2385db3" (UID: "9a751411-2ccc-4bae-bdc6-c34fe2385db3"). InnerVolumeSpecName "kube-api-access-x6ss2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.346073 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/946ddaae-0092-45f2-b5af-3a5168fa64e8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "946ddaae-0092-45f2-b5af-3a5168fa64e8" (UID: "946ddaae-0092-45f2-b5af-3a5168fa64e8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.417390 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.418270 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.419092 4901 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.419771 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.420264 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.442334 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6ss2\" (UniqueName: \"kubernetes.io/projected/9a751411-2ccc-4bae-bdc6-c34fe2385db3-kube-api-access-x6ss2\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.442397 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/946ddaae-0092-45f2-b5af-3a5168fa64e8-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.526300 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550406-smrn9" 
event={"ID":"9a751411-2ccc-4bae-bdc6-c34fe2385db3","Type":"ContainerDied","Data":"1306e491bd1aad0c746f0a7541a3b2e1f34df503de637899fc5ce2eacd048db8"} Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.526705 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1306e491bd1aad0c746f0a7541a3b2e1f34df503de637899fc5ce2eacd048db8" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.526375 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550406-smrn9" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.527914 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"946ddaae-0092-45f2-b5af-3a5168fa64e8","Type":"ContainerDied","Data":"0fb69dfcb4ae21b653c77905eaf02d3b8352b0c01fccf190870874141a2381de"} Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.527939 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fb69dfcb4ae21b653c77905eaf02d3b8352b0c01fccf190870874141a2381de" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.527981 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.530899 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.531768 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38" exitCode=0 Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.531833 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.531852 4901 scope.go:117] "RemoveContainer" containerID="9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.543458 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.543568 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.543598 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.543687 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.543736 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.543787 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.544030 4901 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.544046 4901 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.544056 4901 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.548666 4901 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.549257 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.557499 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.558203 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.559939 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.561070 4901 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" 
Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.562079 4901 scope.go:117] "RemoveContainer" containerID="f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.579525 4901 scope.go:117] "RemoveContainer" containerID="72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.596854 4901 scope.go:117] "RemoveContainer" containerID="36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.611493 4901 scope.go:117] "RemoveContainer" containerID="a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.633478 4901 scope.go:117] "RemoveContainer" containerID="3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.657535 4901 scope.go:117] "RemoveContainer" containerID="9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406" Mar 09 02:46:05 crc kubenswrapper[4901]: E0309 02:46:05.659137 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\": container with ID starting with 9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406 not found: ID does not exist" containerID="9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.659188 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406"} err="failed to get container status \"9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\": rpc error: code = NotFound desc = could not find container 
\"9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406\": container with ID starting with 9f1f96e1c6d66ab0dc08021bc7fe05a276034895cd0bb6784bd0ce5a18b7d406 not found: ID does not exist" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.659308 4901 scope.go:117] "RemoveContainer" containerID="f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf" Mar 09 02:46:05 crc kubenswrapper[4901]: E0309 02:46:05.659968 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\": container with ID starting with f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf not found: ID does not exist" containerID="f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.659996 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf"} err="failed to get container status \"f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\": rpc error: code = NotFound desc = could not find container \"f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf\": container with ID starting with f99df08aa182691c1bdb9ab9fb7b5352b276c236655b1076e2993639d63a94bf not found: ID does not exist" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.660013 4901 scope.go:117] "RemoveContainer" containerID="72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd" Mar 09 02:46:05 crc kubenswrapper[4901]: E0309 02:46:05.660628 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\": container with ID starting with 72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd not found: ID does not exist" 
containerID="72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.660667 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd"} err="failed to get container status \"72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\": rpc error: code = NotFound desc = could not find container \"72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd\": container with ID starting with 72901ff5efbd52bce10e4651744925f7a5758a7a3c2572b581ab2cd222f58bdd not found: ID does not exist" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.660695 4901 scope.go:117] "RemoveContainer" containerID="36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552" Mar 09 02:46:05 crc kubenswrapper[4901]: E0309 02:46:05.661004 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\": container with ID starting with 36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552 not found: ID does not exist" containerID="36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.661055 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552"} err="failed to get container status \"36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\": rpc error: code = NotFound desc = could not find container \"36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552\": container with ID starting with 36d0fe7f5e959b5c71b03b4ad017949229c3a161ea79f7869e610d069248c552 not found: ID does not exist" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.661086 4901 scope.go:117] 
"RemoveContainer" containerID="a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38" Mar 09 02:46:05 crc kubenswrapper[4901]: E0309 02:46:05.661571 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\": container with ID starting with a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38 not found: ID does not exist" containerID="a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.661596 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38"} err="failed to get container status \"a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\": rpc error: code = NotFound desc = could not find container \"a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38\": container with ID starting with a290d0cc95a867bb7d924bfb2e63a518bb753765d4cdb890ccdf96cd36450b38 not found: ID does not exist" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.661612 4901 scope.go:117] "RemoveContainer" containerID="3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf" Mar 09 02:46:05 crc kubenswrapper[4901]: E0309 02:46:05.661999 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\": container with ID starting with 3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf not found: ID does not exist" containerID="3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.662044 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf"} err="failed to get container status \"3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\": rpc error: code = NotFound desc = could not find container \"3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf\": container with ID starting with 3406de49f1c67b014df6844f0f1b5e7960a4d6c3cf62c63b04e0ffe60a7ac3bf not found: ID does not exist" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.860308 4901 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.860884 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:05 crc kubenswrapper[4901]: I0309 02:46:05.861497 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:06 crc kubenswrapper[4901]: I0309 02:46:06.110522 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: 
connect: connection refused" Mar 09 02:46:06 crc kubenswrapper[4901]: I0309 02:46:06.110811 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:06 crc kubenswrapper[4901]: I0309 02:46:06.111276 4901 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:06 crc kubenswrapper[4901]: I0309 02:46:06.124415 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 09 02:46:07 crc kubenswrapper[4901]: E0309 02:46:07.274391 4901 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:07 crc kubenswrapper[4901]: E0309 02:46:07.275451 4901 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:07 crc kubenswrapper[4901]: E0309 02:46:07.275977 4901 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:07 crc kubenswrapper[4901]: E0309 02:46:07.276476 4901 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:07 crc kubenswrapper[4901]: E0309 02:46:07.276968 4901 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:07 crc kubenswrapper[4901]: I0309 02:46:07.277018 4901 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 09 02:46:07 crc kubenswrapper[4901]: E0309 02:46:07.277422 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="200ms" Mar 09 02:46:07 crc kubenswrapper[4901]: E0309 02:46:07.479031 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="400ms" Mar 09 02:46:07 crc kubenswrapper[4901]: E0309 02:46:07.880177 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="800ms" Mar 09 02:46:08 crc kubenswrapper[4901]: E0309 02:46:08.102454 4901 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.20:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:08 crc kubenswrapper[4901]: I0309 02:46:08.103741 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:08 crc kubenswrapper[4901]: W0309 02:46:08.144617 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-06df7266be7eb803ec8788f46c913b6d5eb9496631d6fb8260e71b4e6e0cfbb9 WatchSource:0}: Error finding container 06df7266be7eb803ec8788f46c913b6d5eb9496631d6fb8260e71b4e6e0cfbb9: Status 404 returned error can't find the container with id 06df7266be7eb803ec8788f46c913b6d5eb9496631d6fb8260e71b4e6e0cfbb9 Mar 09 02:46:08 crc kubenswrapper[4901]: E0309 02:46:08.148140 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.20:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b0c44d9891ae3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:46:08.147372771 +0000 UTC m=+292.737036513,LastTimestamp:2026-03-09 02:46:08.147372771 +0000 UTC m=+292.737036513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:46:08 crc kubenswrapper[4901]: I0309 02:46:08.553424 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff"} Mar 09 02:46:08 crc kubenswrapper[4901]: I0309 02:46:08.553866 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"06df7266be7eb803ec8788f46c913b6d5eb9496631d6fb8260e71b4e6e0cfbb9"} Mar 09 02:46:08 crc kubenswrapper[4901]: I0309 02:46:08.554703 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:08 crc kubenswrapper[4901]: E0309 02:46:08.554887 4901 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.20:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:08 crc kubenswrapper[4901]: I0309 02:46:08.554897 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:08 crc kubenswrapper[4901]: E0309 02:46:08.680855 4901 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="1.6s" Mar 09 02:46:08 crc kubenswrapper[4901]: E0309 02:46:08.856196 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.20:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b0c44d9891ae3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 02:46:08.147372771 +0000 UTC m=+292.737036513,LastTimestamp:2026-03-09 02:46:08.147372771 +0000 UTC m=+292.737036513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 02:46:10 crc kubenswrapper[4901]: E0309 02:46:10.282574 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="3.2s" Mar 09 02:46:11 crc kubenswrapper[4901]: E0309 02:46:11.127447 4901 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.20:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" volumeName="registry-storage" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.258652 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" containerName="oauth-openshift" containerID="cri-o://ba483cf28e26977022a8c8ce366cc98cb64776353c9ee50895f3e6e7743f51af" gracePeriod=15 Mar 09 02:46:13 crc kubenswrapper[4901]: E0309 02:46:13.484328 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.20:6443: connect: connection refused" interval="6.4s" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.594167 4901 generic.go:334] "Generic (PLEG): container finished" podID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" containerID="ba483cf28e26977022a8c8ce366cc98cb64776353c9ee50895f3e6e7743f51af" exitCode=0 Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.594216 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" event={"ID":"0bef1913-8737-48c8-bcf5-89daf1bd1c54","Type":"ContainerDied","Data":"ba483cf28e26977022a8c8ce366cc98cb64776353c9ee50895f3e6e7743f51af"} Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.839731 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.840849 4901 status_manager.go:851] "Failed to get status for pod" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tzf54\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.841511 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.842605 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969106 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-idp-0-file-data\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969190 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-cliconfig\") pod 
\"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969209 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-dir\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969238 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-ocp-branding-template\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969271 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-login\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969338 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-service-ca\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969360 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-router-certs\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 
02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969381 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-serving-cert\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969424 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-trusted-ca-bundle\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969443 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-session\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969467 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-policies\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969492 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-error\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969510 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hgfr4\" (UniqueName: \"kubernetes.io/projected/0bef1913-8737-48c8-bcf5-89daf1bd1c54-kube-api-access-hgfr4\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.969534 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-provider-selection\") pod \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\" (UID: \"0bef1913-8737-48c8-bcf5-89daf1bd1c54\") " Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.971163 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.972919 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.973212 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.973428 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.973635 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.976111 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bef1913-8737-48c8-bcf5-89daf1bd1c54-kube-api-access-hgfr4" (OuterVolumeSpecName: "kube-api-access-hgfr4") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "kube-api-access-hgfr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.977683 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.977939 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.978172 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.978532 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.979112 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.979351 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.979738 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:46:13 crc kubenswrapper[4901]: I0309 02:46:13.979955 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "0bef1913-8737-48c8-bcf5-89daf1bd1c54" (UID: "0bef1913-8737-48c8-bcf5-89daf1bd1c54"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.072719 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.073117 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.073287 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.073423 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.073573 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.073712 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.073878 4901 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.074003 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.074131 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgfr4\" (UniqueName: \"kubernetes.io/projected/0bef1913-8737-48c8-bcf5-89daf1bd1c54-kube-api-access-hgfr4\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.074281 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.074417 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.074535 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.074657 4901 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bef1913-8737-48c8-bcf5-89daf1bd1c54-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.074786 4901 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0bef1913-8737-48c8-bcf5-89daf1bd1c54-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.613421 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.613350 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" event={"ID":"0bef1913-8737-48c8-bcf5-89daf1bd1c54","Type":"ContainerDied","Data":"0ee6723aa85968887f12a733a8284e9d66a89e772c7329c665dd351a8b35a40e"} Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.614885 4901 scope.go:117] "RemoveContainer" containerID="ba483cf28e26977022a8c8ce366cc98cb64776353c9ee50895f3e6e7743f51af" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.614706 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.615606 4901 status_manager.go:851] "Failed to get status for pod" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tzf54\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.617333 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.620374 4901 status_manager.go:851] "Failed to get status for pod" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tzf54\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.621021 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:14 crc kubenswrapper[4901]: I0309 02:46:14.621359 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.614294 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.615168 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.620529 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.621998 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.622064 4901 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97" exitCode=1 Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.622137 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97"} Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.622651 4901 scope.go:117] "RemoveContainer" containerID="e0cd220ceed026b8b0448743b09537807f1d4c59e290c74e4afa63b3331aaf97" Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.623012 4901 status_manager.go:851] "Failed to get status for pod" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tzf54\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.624717 4901 status_manager.go:851] "Failed to get status for pod" 
podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.625263 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.626516 4901 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:15 crc kubenswrapper[4901]: I0309 02:46:15.784022 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.120276 4901 status_manager.go:851] "Failed to get status for pod" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tzf54\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.120487 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.120654 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.120860 4901 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.633749 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.635361 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.635435 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f12ce4ade3c93451c2e88c4eacd7f0cc03fe1d48c5a66eef4dbf47b67c0776de"} Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.637000 4901 status_manager.go:851] "Failed to get status for pod" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" 
pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tzf54\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.637661 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.638075 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:16 crc kubenswrapper[4901]: I0309 02:46:16.638621 4901 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.105638 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.107138 4901 status_manager.go:851] "Failed to get status for pod" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tzf54\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.107874 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.109453 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.109961 4901 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.130304 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.130678 4901 mirror_client.go:130] "Deleting a 
mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:17 crc kubenswrapper[4901]: E0309 02:46:17.131383 4901 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.132040 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:17 crc kubenswrapper[4901]: W0309 02:46:17.162187 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-ad339985c41ad0167eb373a2817ba6467a7b52c12206c87cb9e0d89f447f0479 WatchSource:0}: Error finding container ad339985c41ad0167eb373a2817ba6467a7b52c12206c87cb9e0d89f447f0479: Status 404 returned error can't find the container with id ad339985c41ad0167eb373a2817ba6467a7b52c12206c87cb9e0d89f447f0479 Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.644474 4901 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ff07f91f57e88a41e22997f4f27d19d01d6ac04686845fb9e34c70b39921e1c5" exitCode=0 Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.644613 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ff07f91f57e88a41e22997f4f27d19d01d6ac04686845fb9e34c70b39921e1c5"} Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.644700 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ad339985c41ad0167eb373a2817ba6467a7b52c12206c87cb9e0d89f447f0479"} Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.645414 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.645444 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:17 crc kubenswrapper[4901]: E0309 02:46:17.646145 4901 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.646163 4901 status_manager.go:851] "Failed to get status for pod" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" pod="openshift-authentication/oauth-openshift-558db77b4-tzf54" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tzf54\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.646890 4901 status_manager.go:851] "Failed to get status for pod" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" pod="openshift-infra/auto-csr-approver-29550406-smrn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550406-smrn9\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.647747 4901 status_manager.go:851] "Failed to get status for pod" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:17 crc kubenswrapper[4901]: I0309 02:46:17.648326 4901 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.20:6443: connect: connection refused" Mar 09 02:46:18 crc kubenswrapper[4901]: I0309 02:46:18.659247 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"318309d949151248019c41e5b8eecbca35cc154bf2dbfc8c030c7095ff10df4d"} Mar 09 02:46:18 crc kubenswrapper[4901]: I0309 02:46:18.659651 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"945a031c5e207de93d95cffdd4c8f55dc2e86e2287380b7b9470b00a90911577"} Mar 09 02:46:18 crc kubenswrapper[4901]: I0309 02:46:18.659667 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"33b62b3153d366cbdd1f0783f379622d61dcb4dc992d5ffb4cd7df3d0db1b14b"} Mar 09 02:46:19 crc kubenswrapper[4901]: I0309 02:46:19.645900 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:46:19 crc kubenswrapper[4901]: I0309 02:46:19.668067 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bd419b2a244b5ecabfacf4ab492c892133814cef34d6004dda3641b0d79284fb"} Mar 09 02:46:19 crc kubenswrapper[4901]: I0309 02:46:19.668124 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5674cfdcdaaa41fe1390de28597032acafad35849c9be6ea0e790ceac78e55b1"} Mar 09 02:46:19 crc kubenswrapper[4901]: I0309 02:46:19.668270 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:19 crc kubenswrapper[4901]: I0309 02:46:19.668384 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:19 crc kubenswrapper[4901]: I0309 02:46:19.668410 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:22 crc kubenswrapper[4901]: I0309 02:46:22.132843 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:22 crc kubenswrapper[4901]: I0309 02:46:22.133916 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:22 crc kubenswrapper[4901]: I0309 02:46:22.140536 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:24 crc kubenswrapper[4901]: I0309 02:46:24.694418 4901 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:25 crc kubenswrapper[4901]: I0309 02:46:25.716186 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:25 crc kubenswrapper[4901]: I0309 02:46:25.716228 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:25 crc kubenswrapper[4901]: I0309 02:46:25.723123 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:25 crc kubenswrapper[4901]: I0309 02:46:25.783913 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:46:25 crc kubenswrapper[4901]: I0309 02:46:25.791571 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:46:26 crc kubenswrapper[4901]: I0309 02:46:26.121007 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="453aca8f-091c-47b7-b772-9f5ed8faf464" Mar 09 02:46:26 crc kubenswrapper[4901]: I0309 02:46:26.725647 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:26 crc kubenswrapper[4901]: I0309 02:46:26.726064 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:26 crc kubenswrapper[4901]: I0309 02:46:26.730022 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="453aca8f-091c-47b7-b772-9f5ed8faf464" Mar 09 02:46:26 crc kubenswrapper[4901]: I0309 02:46:26.731285 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 02:46:30 crc kubenswrapper[4901]: I0309 02:46:30.862958 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:46:30 crc kubenswrapper[4901]: I0309 02:46:30.863343 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:46:30 crc kubenswrapper[4901]: I0309 02:46:30.863413 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:46:30 crc kubenswrapper[4901]: I0309 02:46:30.864790 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 02:46:30 crc kubenswrapper[4901]: I0309 02:46:30.864887 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae" gracePeriod=600 Mar 09 02:46:31 crc kubenswrapper[4901]: I0309 02:46:31.764104 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae" exitCode=0 Mar 09 02:46:31 crc kubenswrapper[4901]: I0309 02:46:31.764691 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae"} Mar 09 02:46:31 crc kubenswrapper[4901]: I0309 02:46:31.764732 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"4e216bbf999577ca8f01e583ee820521f2479f711576cf371b955a56a58308e3"} Mar 09 02:46:35 crc kubenswrapper[4901]: I0309 02:46:35.068999 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 02:46:35 crc kubenswrapper[4901]: I0309 02:46:35.287207 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 02:46:35 crc kubenswrapper[4901]: I0309 02:46:35.296087 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 02:46:35 crc kubenswrapper[4901]: I0309 02:46:35.888765 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 02:46:35 crc kubenswrapper[4901]: I0309 02:46:35.939787 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.280832 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 
02:46:36.307204 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.346552 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.585758 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.607035 4901 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.718637 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.762683 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.793476 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.909677 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.931080 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.972527 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 02:46:36 crc kubenswrapper[4901]: I0309 02:46:36.985838 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 09 02:46:37 crc kubenswrapper[4901]: I0309 02:46:37.137710 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 02:46:37 crc kubenswrapper[4901]: I0309 02:46:37.346940 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 02:46:37 crc kubenswrapper[4901]: I0309 02:46:37.582353 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 09 02:46:37 crc kubenswrapper[4901]: I0309 02:46:37.845010 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 02:46:37 crc kubenswrapper[4901]: I0309 02:46:37.865210 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 09 02:46:37 crc kubenswrapper[4901]: I0309 02:46:37.985817 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.021291 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.164763 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.392115 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.459162 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.621819 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.657488 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.729886 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.744080 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.791063 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.845825 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.857882 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.857976 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 09 02:46:38 crc kubenswrapper[4901]: I0309 02:46:38.978520 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.138201 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.159135 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.212034 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.223299 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.244812 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.342601 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.364992 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.432662 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.488686 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.654613 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.675401 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.846542 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 02:46:39 crc kubenswrapper[4901]: I0309 02:46:39.972931 4901 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.000418 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.020816 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.180511 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.236215 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.253603 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.339412 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.400904 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.613050 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.641064 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.658966 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 
02:46:40.785716 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.827594 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.827898 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.840731 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 02:46:40 crc kubenswrapper[4901]: I0309 02:46:40.949171 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.010361 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.047287 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.072108 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.181604 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.202911 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.237360 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.316539 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.358149 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.363130 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.436903 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.489403 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.520353 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.609428 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.740049 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.751757 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.778735 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.853841 4901 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.941133 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 02:46:41 crc kubenswrapper[4901]: I0309 02:46:41.972100 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.068508 4901 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.093974 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.127373 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.138142 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.149914 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.162550 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.208999 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.239048 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.289080 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.398196 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.405669 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.425138 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.466963 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.559207 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.603128 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.652331 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.773930 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.843200 4901 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 
02:46:42.862583 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.914746 4901 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 02:46:42 crc kubenswrapper[4901]: I0309 02:46:42.916931 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.037402 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.066106 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.073755 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.187092 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.240556 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.287117 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.297689 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.387960 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.469288 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.474035 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.519063 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.676962 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.744600 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.854113 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.892010 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 02:46:43 crc kubenswrapper[4901]: I0309 02:46:43.941556 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.027887 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.068964 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.199922 4901 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.217782 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.230546 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.267959 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.322028 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.356604 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.374650 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.412746 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.443032 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.522006 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.681689 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.701306 4901 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.719683 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.838213 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.844627 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.864090 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 02:46:44 crc kubenswrapper[4901]: I0309 02:46:44.903614 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.025974 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.057652 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.122779 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.128262 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.152160 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 02:46:45 
crc kubenswrapper[4901]: I0309 02:46:45.176938 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.181721 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.245695 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.271564 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.445120 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.447874 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.508677 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.509257 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.515059 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.647803 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.678410 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-certs-default" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.693912 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.849485 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 02:46:45 crc kubenswrapper[4901]: I0309 02:46:45.993523 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.013202 4901 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.019372 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-tzf54"] Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.020000 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.020048 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84ad2682-53b5-4e9d-acd1-0f0d210b322c" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.020846 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj","openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 02:46:46 crc kubenswrapper[4901]: E0309 02:46:46.021209 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" containerName="installer" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.021256 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" 
containerName="installer" Mar 09 02:46:46 crc kubenswrapper[4901]: E0309 02:46:46.021275 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" containerName="oc" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.021287 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" containerName="oc" Mar 09 02:46:46 crc kubenswrapper[4901]: E0309 02:46:46.021312 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" containerName="oauth-openshift" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.021323 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" containerName="oauth-openshift" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.021489 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" containerName="oauth-openshift" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.021506 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="946ddaae-0092-45f2-b5af-3a5168fa64e8" containerName="installer" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.021524 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" containerName="oc" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.022088 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g","openshift-controller-manager/controller-manager-7b84c445f9-767xj"] Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.022263 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.023087 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" podUID="b849405e-49b5-468b-bfd1-1b305aca9529" containerName="route-controller-manager" containerID="cri-o://b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803" gracePeriod=30 Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.023161 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" podUID="91541d62-9618-47c1-8e73-935764f24375" containerName="controller-manager" containerID="cri-o://183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f" gracePeriod=30 Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.028827 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.029204 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.031465 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.033092 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.034040 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.034185 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.034585 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.034964 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.035036 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.034964 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.035271 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.035306 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.035157 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.070782 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.081369 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.081411 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 
02:46:46.086495 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.089481 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.100522 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.103445 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.1034164 podStartE2EDuration="22.1034164s" podCreationTimestamp="2026-03-09 02:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:46:46.094423632 +0000 UTC m=+330.684087394" watchObservedRunningTime="2026-03-09 02:46:46.1034164 +0000 UTC m=+330.693080162" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.120191 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bef1913-8737-48c8-bcf5-89daf1bd1c54" path="/var/lib/kubelet/pods/0bef1913-8737-48c8-bcf5-89daf1bd1c54/volumes" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.121566 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.121719 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-session\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.121844 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.122003 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eb6a6fc-6667-4169-883b-e159b625af06-audit-dir\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.122120 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-template-error\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.122249 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: 
\"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.122387 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.122503 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.122621 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.122781 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 
crc kubenswrapper[4901]: I0309 02:46:46.122918 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jln9f\" (UniqueName: \"kubernetes.io/projected/0eb6a6fc-6667-4169-883b-e159b625af06-kube-api-access-jln9f\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.123032 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-template-login\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.123145 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.123590 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-audit-policies\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.175850 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.224991 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225034 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-session\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225059 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225086 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eb6a6fc-6667-4169-883b-e159b625af06-audit-dir\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225106 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-template-error\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225129 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225154 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225173 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225193 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " 
pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225218 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225303 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jln9f\" (UniqueName: \"kubernetes.io/projected/0eb6a6fc-6667-4169-883b-e159b625af06-kube-api-access-jln9f\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225326 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-template-login\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225353 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225379 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-audit-policies\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.225446 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eb6a6fc-6667-4169-883b-e159b625af06-audit-dir\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.226363 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.226485 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-audit-policies\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.226487 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.226993 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.232592 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.232697 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-template-login\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.232890 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.232735 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-user-template-error\") pod 
\"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.233135 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.233585 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.240612 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.240836 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0eb6a6fc-6667-4169-883b-e159b625af06-v4-0-config-system-session\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.243780 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jln9f\" (UniqueName: \"kubernetes.io/projected/0eb6a6fc-6667-4169-883b-e159b625af06-kube-api-access-jln9f\") pod \"oauth-openshift-7fdcdd74b7-fd7dj\" (UID: \"0eb6a6fc-6667-4169-883b-e159b625af06\") " pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.246314 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.303738 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.316784 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.321608 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.374729 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.499949 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.509019 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.522776 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.533479 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l"] Mar 09 02:46:46 crc kubenswrapper[4901]: E0309 02:46:46.533756 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b849405e-49b5-468b-bfd1-1b305aca9529" containerName="route-controller-manager" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.533774 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b849405e-49b5-468b-bfd1-1b305aca9529" containerName="route-controller-manager" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.533903 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b849405e-49b5-468b-bfd1-1b305aca9529" containerName="route-controller-manager" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.535731 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.562697 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.575752 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.598775 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.601920 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.630510 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-config\") pod \"b849405e-49b5-468b-bfd1-1b305aca9529\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.630557 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b849405e-49b5-468b-bfd1-1b305aca9529-serving-cert\") pod \"b849405e-49b5-468b-bfd1-1b305aca9529\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.630604 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5g6p\" (UniqueName: \"kubernetes.io/projected/b849405e-49b5-468b-bfd1-1b305aca9529-kube-api-access-h5g6p\") pod \"b849405e-49b5-468b-bfd1-1b305aca9529\" (UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.630679 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-client-ca\") pod \"b849405e-49b5-468b-bfd1-1b305aca9529\" 
(UID: \"b849405e-49b5-468b-bfd1-1b305aca9529\") " Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.630709 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.631412 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-client-ca" (OuterVolumeSpecName: "client-ca") pod "b849405e-49b5-468b-bfd1-1b305aca9529" (UID: "b849405e-49b5-468b-bfd1-1b305aca9529"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.631605 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-config" (OuterVolumeSpecName: "config") pod "b849405e-49b5-468b-bfd1-1b305aca9529" (UID: "b849405e-49b5-468b-bfd1-1b305aca9529"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.634538 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b849405e-49b5-468b-bfd1-1b305aca9529-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b849405e-49b5-468b-bfd1-1b305aca9529" (UID: "b849405e-49b5-468b-bfd1-1b305aca9529"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.635450 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b849405e-49b5-468b-bfd1-1b305aca9529-kube-api-access-h5g6p" (OuterVolumeSpecName: "kube-api-access-h5g6p") pod "b849405e-49b5-468b-bfd1-1b305aca9529" (UID: "b849405e-49b5-468b-bfd1-1b305aca9529"). InnerVolumeSpecName "kube-api-access-h5g6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.678815 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.731503 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-config\") pod \"91541d62-9618-47c1-8e73-935764f24375\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.731573 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxfkr\" (UniqueName: \"kubernetes.io/projected/91541d62-9618-47c1-8e73-935764f24375-kube-api-access-xxfkr\") pod \"91541d62-9618-47c1-8e73-935764f24375\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.731624 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-proxy-ca-bundles\") pod \"91541d62-9618-47c1-8e73-935764f24375\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.731654 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-client-ca\") pod \"91541d62-9618-47c1-8e73-935764f24375\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") " Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.731672 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91541d62-9618-47c1-8e73-935764f24375-serving-cert\") pod \"91541d62-9618-47c1-8e73-935764f24375\" (UID: \"91541d62-9618-47c1-8e73-935764f24375\") 
" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.731850 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-serving-cert\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.731875 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb4lg\" (UniqueName: \"kubernetes.io/projected/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-kube-api-access-cb4lg\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.731924 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-client-ca\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.731946 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-config\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.732480 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.732546 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849405e-49b5-468b-bfd1-1b305aca9529-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.732562 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b849405e-49b5-468b-bfd1-1b305aca9529-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.732577 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5g6p\" (UniqueName: \"kubernetes.io/projected/b849405e-49b5-468b-bfd1-1b305aca9529-kube-api-access-h5g6p\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.732919 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-client-ca" (OuterVolumeSpecName: "client-ca") pod "91541d62-9618-47c1-8e73-935764f24375" (UID: "91541d62-9618-47c1-8e73-935764f24375"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.732938 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "91541d62-9618-47c1-8e73-935764f24375" (UID: "91541d62-9618-47c1-8e73-935764f24375"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.733003 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-config" (OuterVolumeSpecName: "config") pod "91541d62-9618-47c1-8e73-935764f24375" (UID: "91541d62-9618-47c1-8e73-935764f24375"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.734479 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91541d62-9618-47c1-8e73-935764f24375-kube-api-access-xxfkr" (OuterVolumeSpecName: "kube-api-access-xxfkr") pod "91541d62-9618-47c1-8e73-935764f24375" (UID: "91541d62-9618-47c1-8e73-935764f24375"). InnerVolumeSpecName "kube-api-access-xxfkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.735107 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91541d62-9618-47c1-8e73-935764f24375-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "91541d62-9618-47c1-8e73-935764f24375" (UID: "91541d62-9618-47c1-8e73-935764f24375"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.757293 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.802615 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.823556 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.833755 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-client-ca\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.833806 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-config\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.833890 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-serving-cert\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.833918 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cb4lg\" (UniqueName: \"kubernetes.io/projected/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-kube-api-access-cb4lg\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.833971 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxfkr\" (UniqueName: \"kubernetes.io/projected/91541d62-9618-47c1-8e73-935764f24375-kube-api-access-xxfkr\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.833985 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.833997 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.834009 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91541d62-9618-47c1-8e73-935764f24375-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.834024 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91541d62-9618-47c1-8e73-935764f24375-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.834652 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-client-ca\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: 
\"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.835785 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-config\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.838984 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-serving-cert\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.853906 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb4lg\" (UniqueName: \"kubernetes.io/projected/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-kube-api-access-cb4lg\") pod \"route-controller-manager-7f5fcdfd45-6r89l\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.859814 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.871335 4901 generic.go:334] "Generic (PLEG): container finished" podID="b849405e-49b5-468b-bfd1-1b305aca9529" containerID="b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803" exitCode=0 Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.871403 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.871444 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" event={"ID":"b849405e-49b5-468b-bfd1-1b305aca9529","Type":"ContainerDied","Data":"b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803"} Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.871514 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g" event={"ID":"b849405e-49b5-468b-bfd1-1b305aca9529","Type":"ContainerDied","Data":"51716f54fc67d16127a9475e2bfcaab93d23ba104cae3671f5a0a5a4a0ca4cd5"} Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.871540 4901 scope.go:117] "RemoveContainer" containerID="b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.873437 4901 generic.go:334] "Generic (PLEG): container finished" podID="91541d62-9618-47c1-8e73-935764f24375" containerID="183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f" exitCode=0 Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.873497 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.873488 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" event={"ID":"91541d62-9618-47c1-8e73-935764f24375","Type":"ContainerDied","Data":"183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f"} Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.873634 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b84c445f9-767xj" event={"ID":"91541d62-9618-47c1-8e73-935764f24375","Type":"ContainerDied","Data":"ab4db9c251c1979b5815efe33de3175a63afed4d9e2c007250bbe0305692a7cf"} Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.892393 4901 scope.go:117] "RemoveContainer" containerID="b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803" Mar 09 02:46:46 crc kubenswrapper[4901]: E0309 02:46:46.892883 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803\": container with ID starting with b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803 not found: ID does not exist" containerID="b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.892927 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803"} err="failed to get container status \"b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803\": rpc error: code = NotFound desc = could not find container \"b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803\": container with ID starting with b7670c3760e6467bb7b242b7754cd587962dc08f1725257c8b6af1392a5bf803 not found: ID does 
not exist" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.892964 4901 scope.go:117] "RemoveContainer" containerID="183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.911824 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.922065 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b84c445f9-767xj"] Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.926262 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b84c445f9-767xj"] Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.930863 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.932601 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g"] Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.935612 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ccd8f799d-cmm5g"] Mar 09 02:46:46 crc kubenswrapper[4901]: I0309 02:46:46.939139 4901 scope.go:117] "RemoveContainer" containerID="183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f" Mar 09 02:46:46 crc kubenswrapper[4901]: E0309 02:46:46.939612 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f\": container with ID starting with 183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f not found: ID does not exist" containerID="183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f" Mar 09 02:46:46 crc kubenswrapper[4901]: 
I0309 02:46:46.939646 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f"} err="failed to get container status \"183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f\": rpc error: code = NotFound desc = could not find container \"183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f\": container with ID starting with 183ccd3bb50f49307668bdc7a563bdfd50e71b0e126c2e31a957fa694aed581f not found: ID does not exist" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.028862 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.044551 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.333603 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.356495 4901 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.356849 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff" gracePeriod=5 Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.388294 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.413437 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.422818 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.428545 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.484117 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.518655 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.534847 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.566974 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.567933 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.576019 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.577522 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.611732 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.629050 
4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.664169 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.673052 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.708653 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.710783 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.798846 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.838669 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.886387 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj"] Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.925822 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l"] Mar 09 02:46:47 crc kubenswrapper[4901]: I0309 02:46:47.964338 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.016344 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.094056 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj"] Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.116551 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.122266 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91541d62-9618-47c1-8e73-935764f24375" path="/var/lib/kubelet/pods/91541d62-9618-47c1-8e73-935764f24375/volumes" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.122749 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b849405e-49b5-468b-bfd1-1b305aca9529" path="/var/lib/kubelet/pods/b849405e-49b5-468b-bfd1-1b305aca9529/volumes" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.145997 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l"] Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.185725 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.327376 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.515162 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.518146 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.518244 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.611776 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.632940 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d845599ff-t2nq7"] Mar 09 02:46:48 crc kubenswrapper[4901]: E0309 02:46:48.633247 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91541d62-9618-47c1-8e73-935764f24375" containerName="controller-manager" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.633275 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="91541d62-9618-47c1-8e73-935764f24375" containerName="controller-manager" Mar 09 02:46:48 crc kubenswrapper[4901]: E0309 02:46:48.633307 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.633320 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.633484 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="91541d62-9618-47c1-8e73-935764f24375" containerName="controller-manager" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.633512 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.634065 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.635784 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.635858 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.640659 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.641274 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.642864 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.647646 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.648192 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.652669 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d845599ff-t2nq7"] Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.678071 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-config\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " 
pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.678139 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-client-ca\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.678208 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-proxy-ca-bundles\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.678261 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp9fx\" (UniqueName: \"kubernetes.io/projected/a4c40144-aba6-43ab-9fc8-9db68f0785d0-kube-api-access-cp9fx\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.678285 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c40144-aba6-43ab-9fc8-9db68f0785d0-serving-cert\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.779425 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cp9fx\" (UniqueName: \"kubernetes.io/projected/a4c40144-aba6-43ab-9fc8-9db68f0785d0-kube-api-access-cp9fx\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.779490 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c40144-aba6-43ab-9fc8-9db68f0785d0-serving-cert\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.779588 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-config\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.779625 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-client-ca\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.779689 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-proxy-ca-bundles\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 
02:46:48.781379 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-proxy-ca-bundles\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.784415 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-client-ca\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.784696 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-config\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.789057 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.802286 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c40144-aba6-43ab-9fc8-9db68f0785d0-serving-cert\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.802471 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp9fx\" (UniqueName: 
\"kubernetes.io/projected/a4c40144-aba6-43ab-9fc8-9db68f0785d0-kube-api-access-cp9fx\") pod \"controller-manager-6d845599ff-t2nq7\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.868038 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.870418 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.897278 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" event={"ID":"0eb6a6fc-6667-4169-883b-e159b625af06","Type":"ContainerStarted","Data":"772f1f63d7f44c959f051d6cceb2a54dc11135a8882b29eea6389d43c92d606f"} Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.897322 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" event={"ID":"0eb6a6fc-6667-4169-883b-e159b625af06","Type":"ContainerStarted","Data":"261e78d998bf4ee714ec93f7e0332cb25c6daecc33787c20c1da98df77bfe3c0"} Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.897901 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.899535 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" event={"ID":"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab","Type":"ContainerStarted","Data":"60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639"} Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.899559 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" event={"ID":"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab","Type":"ContainerStarted","Data":"06989f9ec787d222e7c345107d4023e20f008c74fdb6661f56a33ece978a6958"} Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.900055 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.906648 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.917427 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.949604 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.951761 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7fdcdd74b7-fd7dj" podStartSLOduration=60.951741872 podStartE2EDuration="1m0.951741872s" podCreationTimestamp="2026-03-09 02:45:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:46:48.931998639 +0000 UTC m=+333.521662371" watchObservedRunningTime="2026-03-09 02:46:48.951741872 +0000 UTC m=+333.541405604" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.954128 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.979082 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 09 02:46:48 crc kubenswrapper[4901]: I0309 02:46:48.998207 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.088173 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.112008 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.126710 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" podStartSLOduration=5.126691318 podStartE2EDuration="5.126691318s" podCreationTimestamp="2026-03-09 02:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:46:48.978995994 +0000 UTC m=+333.568659766" watchObservedRunningTime="2026-03-09 02:46:49.126691318 +0000 UTC m=+333.716355050" Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.157602 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.389091 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.408885 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.416431 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-6d845599ff-t2nq7"] Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.608842 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.768492 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.906002 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" event={"ID":"a4c40144-aba6-43ab-9fc8-9db68f0785d0","Type":"ContainerStarted","Data":"88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509"} Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.906076 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" event={"ID":"a4c40144-aba6-43ab-9fc8-9db68f0785d0","Type":"ContainerStarted","Data":"f812dc4279d170e7dff3aa9624d1dc6aafae0eec59e8f2eb40260894091285fa"} Mar 09 02:46:49 crc kubenswrapper[4901]: I0309 02:46:49.921138 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" podStartSLOduration=5.921117827 podStartE2EDuration="5.921117827s" podCreationTimestamp="2026-03-09 02:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:46:49.920902291 +0000 UTC m=+334.510566033" watchObservedRunningTime="2026-03-09 02:46:49.921117827 +0000 UTC m=+334.510781579" Mar 09 02:46:50 crc kubenswrapper[4901]: I0309 02:46:50.230845 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 02:46:50 crc kubenswrapper[4901]: I0309 02:46:50.297480 4901 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 09 02:46:50 crc kubenswrapper[4901]: I0309 02:46:50.373309 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 09 02:46:50 crc kubenswrapper[4901]: I0309 02:46:50.397624 4901 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 02:46:50 crc kubenswrapper[4901]: I0309 02:46:50.672213 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 02:46:50 crc kubenswrapper[4901]: I0309 02:46:50.714140 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 02:46:50 crc kubenswrapper[4901]: I0309 02:46:50.759101 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 02:46:50 crc kubenswrapper[4901]: I0309 02:46:50.911330 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:50 crc kubenswrapper[4901]: I0309 02:46:50.917778 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:46:50 crc kubenswrapper[4901]: I0309 02:46:50.942331 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 02:46:51 crc kubenswrapper[4901]: I0309 02:46:51.612306 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.119655 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 02:46:52 crc 
kubenswrapper[4901]: I0309 02:46:52.507596 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.507688 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.536675 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.536738 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.536776 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.536826 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.536868 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.536820 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.537112 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.536985 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.537575 4901 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.537652 4901 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.537676 4901 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.537753 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.547797 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.639160 4901 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.639216 4901 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.936499 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.936549 4901 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff" exitCode=137 Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.937270 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.938514 4901 scope.go:117] "RemoveContainer" containerID="519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.961432 4901 scope.go:117] "RemoveContainer" containerID="519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff" Mar 09 02:46:52 crc kubenswrapper[4901]: E0309 02:46:52.962096 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff\": container with ID starting with 519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff not found: ID does not exist" containerID="519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff" Mar 09 02:46:52 crc kubenswrapper[4901]: I0309 02:46:52.962164 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff"} err="failed to get container status \"519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff\": rpc error: code = NotFound desc = could not find container \"519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff\": container with ID starting with 519dffd60b7d2a9e0f899da0e30ddb6ae5167331b7e470ef4c05af561c03eeff not found: ID does not exist" Mar 09 02:46:54 crc kubenswrapper[4901]: I0309 02:46:54.117668 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.213259 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d845599ff-t2nq7"] Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 
02:47:04.213942 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" podUID="a4c40144-aba6-43ab-9fc8-9db68f0785d0" containerName="controller-manager" containerID="cri-o://88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509" gracePeriod=30 Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.221523 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l"] Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.221763 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" podUID="e65cf217-f87a-45fd-a6d8-e3d3b2b11cab" containerName="route-controller-manager" containerID="cri-o://60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639" gracePeriod=30 Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.788600 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.810668 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-client-ca\") pod \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.810714 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb4lg\" (UniqueName: \"kubernetes.io/projected/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-kube-api-access-cb4lg\") pod \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.810768 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-config\") pod \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.810800 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-serving-cert\") pod \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\" (UID: \"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab\") " Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.811977 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-client-ca" (OuterVolumeSpecName: "client-ca") pod "e65cf217-f87a-45fd-a6d8-e3d3b2b11cab" (UID: "e65cf217-f87a-45fd-a6d8-e3d3b2b11cab"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.812360 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-config" (OuterVolumeSpecName: "config") pod "e65cf217-f87a-45fd-a6d8-e3d3b2b11cab" (UID: "e65cf217-f87a-45fd-a6d8-e3d3b2b11cab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.816319 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e65cf217-f87a-45fd-a6d8-e3d3b2b11cab" (UID: "e65cf217-f87a-45fd-a6d8-e3d3b2b11cab"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.816681 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-kube-api-access-cb4lg" (OuterVolumeSpecName: "kube-api-access-cb4lg") pod "e65cf217-f87a-45fd-a6d8-e3d3b2b11cab" (UID: "e65cf217-f87a-45fd-a6d8-e3d3b2b11cab"). InnerVolumeSpecName "kube-api-access-cb4lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.872664 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.911977 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-client-ca\") pod \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.912020 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-proxy-ca-bundles\") pod \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.912107 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c40144-aba6-43ab-9fc8-9db68f0785d0-serving-cert\") pod \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.912126 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-config\") pod \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.912166 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp9fx\" (UniqueName: \"kubernetes.io/projected/a4c40144-aba6-43ab-9fc8-9db68f0785d0-kube-api-access-cp9fx\") pod \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\" (UID: \"a4c40144-aba6-43ab-9fc8-9db68f0785d0\") " Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.912784 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-client-ca" (OuterVolumeSpecName: "client-ca") pod "a4c40144-aba6-43ab-9fc8-9db68f0785d0" (UID: "a4c40144-aba6-43ab-9fc8-9db68f0785d0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.912777 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a4c40144-aba6-43ab-9fc8-9db68f0785d0" (UID: "a4c40144-aba6-43ab-9fc8-9db68f0785d0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.913000 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-config" (OuterVolumeSpecName: "config") pod "a4c40144-aba6-43ab-9fc8-9db68f0785d0" (UID: "a4c40144-aba6-43ab-9fc8-9db68f0785d0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.912935 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.913110 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb4lg\" (UniqueName: \"kubernetes.io/projected/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-kube-api-access-cb4lg\") on node \"crc\" DevicePath \"\"" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.913123 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.913134 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.915190 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c40144-aba6-43ab-9fc8-9db68f0785d0-kube-api-access-cp9fx" (OuterVolumeSpecName: "kube-api-access-cp9fx") pod "a4c40144-aba6-43ab-9fc8-9db68f0785d0" (UID: "a4c40144-aba6-43ab-9fc8-9db68f0785d0"). InnerVolumeSpecName "kube-api-access-cp9fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:47:04 crc kubenswrapper[4901]: I0309 02:47:04.917482 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c40144-aba6-43ab-9fc8-9db68f0785d0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a4c40144-aba6-43ab-9fc8-9db68f0785d0" (UID: "a4c40144-aba6-43ab-9fc8-9db68f0785d0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.013869 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.013923 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.013942 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c40144-aba6-43ab-9fc8-9db68f0785d0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.013959 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4c40144-aba6-43ab-9fc8-9db68f0785d0-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.013979 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp9fx\" (UniqueName: \"kubernetes.io/projected/a4c40144-aba6-43ab-9fc8-9db68f0785d0-kube-api-access-cp9fx\") on node \"crc\" DevicePath \"\"" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.024906 4901 generic.go:334] "Generic (PLEG): container finished" podID="a4c40144-aba6-43ab-9fc8-9db68f0785d0" containerID="88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509" exitCode=0 Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.025049 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" event={"ID":"a4c40144-aba6-43ab-9fc8-9db68f0785d0","Type":"ContainerDied","Data":"88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509"} Mar 09 02:47:05 crc 
kubenswrapper[4901]: I0309 02:47:05.025011 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.025154 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d845599ff-t2nq7" event={"ID":"a4c40144-aba6-43ab-9fc8-9db68f0785d0","Type":"ContainerDied","Data":"f812dc4279d170e7dff3aa9624d1dc6aafae0eec59e8f2eb40260894091285fa"} Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.025254 4901 scope.go:117] "RemoveContainer" containerID="88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.030632 4901 generic.go:334] "Generic (PLEG): container finished" podID="e65cf217-f87a-45fd-a6d8-e3d3b2b11cab" containerID="60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639" exitCode=0 Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.030671 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" event={"ID":"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab","Type":"ContainerDied","Data":"60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639"} Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.030706 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" event={"ID":"e65cf217-f87a-45fd-a6d8-e3d3b2b11cab","Type":"ContainerDied","Data":"06989f9ec787d222e7c345107d4023e20f008c74fdb6661f56a33ece978a6958"} Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.030727 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.050917 4901 scope.go:117] "RemoveContainer" containerID="88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509" Mar 09 02:47:05 crc kubenswrapper[4901]: E0309 02:47:05.051311 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509\": container with ID starting with 88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509 not found: ID does not exist" containerID="88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.051385 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509"} err="failed to get container status \"88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509\": rpc error: code = NotFound desc = could not find container \"88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509\": container with ID starting with 88713dab6d5f828b8e5a6cd4b34882dab4b0aceeb33af70bd346f50d13ed0509 not found: ID does not exist" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.051407 4901 scope.go:117] "RemoveContainer" containerID="60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.068158 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l"] Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.075666 4901 scope.go:117] "RemoveContainer" containerID="60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639" Mar 09 02:47:05 crc kubenswrapper[4901]: E0309 02:47:05.076125 4901 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639\": container with ID starting with 60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639 not found: ID does not exist" containerID="60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.076155 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639"} err="failed to get container status \"60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639\": rpc error: code = NotFound desc = could not find container \"60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639\": container with ID starting with 60b84281f27443503a3b07635cc5fd6b6023911a4b923ea99b5bf709a8c10639 not found: ID does not exist" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.084673 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-6r89l"] Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.089734 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d845599ff-t2nq7"] Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.093604 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d845599ff-t2nq7"] Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.646116 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-hhvgw"] Mar 09 02:47:05 crc kubenswrapper[4901]: E0309 02:47:05.647118 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c40144-aba6-43ab-9fc8-9db68f0785d0" containerName="controller-manager" Mar 09 02:47:05 crc kubenswrapper[4901]: 
I0309 02:47:05.647359 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c40144-aba6-43ab-9fc8-9db68f0785d0" containerName="controller-manager" Mar 09 02:47:05 crc kubenswrapper[4901]: E0309 02:47:05.647494 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65cf217-f87a-45fd-a6d8-e3d3b2b11cab" containerName="route-controller-manager" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.647602 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65cf217-f87a-45fd-a6d8-e3d3b2b11cab" containerName="route-controller-manager" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.647839 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e65cf217-f87a-45fd-a6d8-e3d3b2b11cab" containerName="route-controller-manager" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.647963 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c40144-aba6-43ab-9fc8-9db68f0785d0" containerName="controller-manager" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.648644 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.650879 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.652722 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.652922 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.654142 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.654144 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.654256 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.654673 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt"] Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.655713 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.659101 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.659183 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.659269 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.659339 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.660520 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.660716 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.661662 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.675895 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt"] Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.683071 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-hhvgw"] Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.722480 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zmggl\" (UniqueName: \"kubernetes.io/projected/9846729e-f0fa-4d45-8951-209bb228c5ed-kube-api-access-zmggl\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.722683 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-config\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.722903 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9846729e-f0fa-4d45-8951-209bb228c5ed-serving-cert\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.722952 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-config\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.723008 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c25r6\" (UniqueName: \"kubernetes.io/projected/02138685-46e1-4e52-bec8-78b88fb4f85b-kube-api-access-c25r6\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " 
pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.723031 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02138685-46e1-4e52-bec8-78b88fb4f85b-serving-cert\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.723067 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-client-ca\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.723094 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-client-ca\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.723198 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-proxy-ca-bundles\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.824515 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zmggl\" (UniqueName: \"kubernetes.io/projected/9846729e-f0fa-4d45-8951-209bb228c5ed-kube-api-access-zmggl\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.824778 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-config\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.824896 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9846729e-f0fa-4d45-8951-209bb228c5ed-serving-cert\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.824991 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-config\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.825102 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c25r6\" (UniqueName: \"kubernetes.io/projected/02138685-46e1-4e52-bec8-78b88fb4f85b-kube-api-access-c25r6\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" 
Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.825503 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02138685-46e1-4e52-bec8-78b88fb4f85b-serving-cert\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.825622 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-client-ca\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.825697 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-client-ca\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.825780 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-proxy-ca-bundles\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.826618 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-config\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: 
\"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.826905 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-proxy-ca-bundles\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.827013 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-client-ca\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.827920 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-client-ca\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.827980 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-config\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.828955 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9846729e-f0fa-4d45-8951-209bb228c5ed-serving-cert\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.830716 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02138685-46e1-4e52-bec8-78b88fb4f85b-serving-cert\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.843409 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c25r6\" (UniqueName: \"kubernetes.io/projected/02138685-46e1-4e52-bec8-78b88fb4f85b-kube-api-access-c25r6\") pod \"route-controller-manager-77bbdf4988-j6jxt\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.852938 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmggl\" (UniqueName: \"kubernetes.io/projected/9846729e-f0fa-4d45-8951-209bb228c5ed-kube-api-access-zmggl\") pod \"controller-manager-64f4f8899d-hhvgw\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:05 crc kubenswrapper[4901]: I0309 02:47:05.989460 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:06 crc kubenswrapper[4901]: I0309 02:47:06.000652 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:06 crc kubenswrapper[4901]: I0309 02:47:06.132121 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c40144-aba6-43ab-9fc8-9db68f0785d0" path="/var/lib/kubelet/pods/a4c40144-aba6-43ab-9fc8-9db68f0785d0/volumes" Mar 09 02:47:06 crc kubenswrapper[4901]: I0309 02:47:06.132819 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e65cf217-f87a-45fd-a6d8-e3d3b2b11cab" path="/var/lib/kubelet/pods/e65cf217-f87a-45fd-a6d8-e3d3b2b11cab/volumes" Mar 09 02:47:06 crc kubenswrapper[4901]: I0309 02:47:06.430363 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-hhvgw"] Mar 09 02:47:06 crc kubenswrapper[4901]: I0309 02:47:06.495330 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt"] Mar 09 02:47:06 crc kubenswrapper[4901]: W0309 02:47:06.610319 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02138685_46e1_4e52_bec8_78b88fb4f85b.slice/crio-1aa4e4397af2ed642527fb3f0d12c305b0dc71efe7135dc9ba1c5ca698940b5b WatchSource:0}: Error finding container 1aa4e4397af2ed642527fb3f0d12c305b0dc71efe7135dc9ba1c5ca698940b5b: Status 404 returned error can't find the container with id 1aa4e4397af2ed642527fb3f0d12c305b0dc71efe7135dc9ba1c5ca698940b5b Mar 09 02:47:07 crc kubenswrapper[4901]: I0309 02:47:07.042736 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" event={"ID":"02138685-46e1-4e52-bec8-78b88fb4f85b","Type":"ContainerStarted","Data":"d843013591b2789b9170b08145a2f6cb3062910f77b74dbfafd16f5ee42ffc45"} Mar 09 02:47:07 crc kubenswrapper[4901]: I0309 02:47:07.042977 4901 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" event={"ID":"02138685-46e1-4e52-bec8-78b88fb4f85b","Type":"ContainerStarted","Data":"1aa4e4397af2ed642527fb3f0d12c305b0dc71efe7135dc9ba1c5ca698940b5b"} Mar 09 02:47:07 crc kubenswrapper[4901]: I0309 02:47:07.044372 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:47:07 crc kubenswrapper[4901]: I0309 02:47:07.045660 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" event={"ID":"9846729e-f0fa-4d45-8951-209bb228c5ed","Type":"ContainerStarted","Data":"abbb7124e322c21585d012fc5d95e8635e364c245a2c80b7dff81af242f5dbfa"} Mar 09 02:47:07 crc kubenswrapper[4901]: I0309 02:47:07.045707 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" event={"ID":"9846729e-f0fa-4d45-8951-209bb228c5ed","Type":"ContainerStarted","Data":"7947f5617c880c327d0b6d4620f5cdb2622f1708688018b674da7e75c5d091a2"} Mar 09 02:47:07 crc kubenswrapper[4901]: I0309 02:47:07.045916 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:07 crc kubenswrapper[4901]: I0309 02:47:07.051470 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:47:07 crc kubenswrapper[4901]: I0309 02:47:07.059344 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" podStartSLOduration=3.05932655 podStartE2EDuration="3.05932655s" podCreationTimestamp="2026-03-09 02:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 02:47:07.056601498 +0000 UTC m=+351.646265260" watchObservedRunningTime="2026-03-09 02:47:07.05932655 +0000 UTC m=+351.648990282" Mar 09 02:47:07 crc kubenswrapper[4901]: I0309 02:47:07.070370 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" podStartSLOduration=3.070352233 podStartE2EDuration="3.070352233s" podCreationTimestamp="2026-03-09 02:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:47:07.068575785 +0000 UTC m=+351.658239517" watchObservedRunningTime="2026-03-09 02:47:07.070352233 +0000 UTC m=+351.660015965" Mar 09 02:47:07 crc kubenswrapper[4901]: I0309 02:47:07.167902 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.154184 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550408-m756h"] Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.155392 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550408-m756h" Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.158348 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.159583 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.159977 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.161509 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550408-m756h"] Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.286258 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrbrc\" (UniqueName: \"kubernetes.io/projected/75f216f4-98d4-44fe-b4f3-e6908f28ed4e-kube-api-access-vrbrc\") pod \"auto-csr-approver-29550408-m756h\" (UID: \"75f216f4-98d4-44fe-b4f3-e6908f28ed4e\") " pod="openshift-infra/auto-csr-approver-29550408-m756h" Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.388257 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrbrc\" (UniqueName: \"kubernetes.io/projected/75f216f4-98d4-44fe-b4f3-e6908f28ed4e-kube-api-access-vrbrc\") pod \"auto-csr-approver-29550408-m756h\" (UID: \"75f216f4-98d4-44fe-b4f3-e6908f28ed4e\") " pod="openshift-infra/auto-csr-approver-29550408-m756h" Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.415256 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrbrc\" (UniqueName: \"kubernetes.io/projected/75f216f4-98d4-44fe-b4f3-e6908f28ed4e-kube-api-access-vrbrc\") pod \"auto-csr-approver-29550408-m756h\" (UID: \"75f216f4-98d4-44fe-b4f3-e6908f28ed4e\") " 
pod="openshift-infra/auto-csr-approver-29550408-m756h" Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.481425 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550408-m756h" Mar 09 02:48:00 crc kubenswrapper[4901]: I0309 02:48:00.925507 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550408-m756h"] Mar 09 02:48:01 crc kubenswrapper[4901]: I0309 02:48:01.396053 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550408-m756h" event={"ID":"75f216f4-98d4-44fe-b4f3-e6908f28ed4e","Type":"ContainerStarted","Data":"b584439c4e00d5fdef80ff0e27a40f38d54f6629fa425606c86440b4005f908d"} Mar 09 02:48:03 crc kubenswrapper[4901]: I0309 02:48:03.411043 4901 generic.go:334] "Generic (PLEG): container finished" podID="75f216f4-98d4-44fe-b4f3-e6908f28ed4e" containerID="9aa22b48b9223f46a4856b17630905d18e6700f71dd31c936b28850433b71299" exitCode=0 Mar 09 02:48:03 crc kubenswrapper[4901]: I0309 02:48:03.411130 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550408-m756h" event={"ID":"75f216f4-98d4-44fe-b4f3-e6908f28ed4e","Type":"ContainerDied","Data":"9aa22b48b9223f46a4856b17630905d18e6700f71dd31c936b28850433b71299"} Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.238590 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-hhvgw"] Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.239049 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" podUID="9846729e-f0fa-4d45-8951-209bb228c5ed" containerName="controller-manager" containerID="cri-o://abbb7124e322c21585d012fc5d95e8635e364c245a2c80b7dff81af242f5dbfa" gracePeriod=30 Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.256528 4901 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt"] Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.256898 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" podUID="02138685-46e1-4e52-bec8-78b88fb4f85b" containerName="route-controller-manager" containerID="cri-o://d843013591b2789b9170b08145a2f6cb3062910f77b74dbfafd16f5ee42ffc45" gracePeriod=30 Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.421964 4901 generic.go:334] "Generic (PLEG): container finished" podID="9846729e-f0fa-4d45-8951-209bb228c5ed" containerID="abbb7124e322c21585d012fc5d95e8635e364c245a2c80b7dff81af242f5dbfa" exitCode=0 Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.422063 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" event={"ID":"9846729e-f0fa-4d45-8951-209bb228c5ed","Type":"ContainerDied","Data":"abbb7124e322c21585d012fc5d95e8635e364c245a2c80b7dff81af242f5dbfa"} Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.424419 4901 generic.go:334] "Generic (PLEG): container finished" podID="02138685-46e1-4e52-bec8-78b88fb4f85b" containerID="d843013591b2789b9170b08145a2f6cb3062910f77b74dbfafd16f5ee42ffc45" exitCode=0 Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.424568 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" event={"ID":"02138685-46e1-4e52-bec8-78b88fb4f85b","Type":"ContainerDied","Data":"d843013591b2789b9170b08145a2f6cb3062910f77b74dbfafd16f5ee42ffc45"} Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.707873 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.711893 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.717522 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550408-m756h" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.850625 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-config\") pod \"9846729e-f0fa-4d45-8951-209bb228c5ed\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.850694 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrbrc\" (UniqueName: \"kubernetes.io/projected/75f216f4-98d4-44fe-b4f3-e6908f28ed4e-kube-api-access-vrbrc\") pod \"75f216f4-98d4-44fe-b4f3-e6908f28ed4e\" (UID: \"75f216f4-98d4-44fe-b4f3-e6908f28ed4e\") " Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.850739 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c25r6\" (UniqueName: \"kubernetes.io/projected/02138685-46e1-4e52-bec8-78b88fb4f85b-kube-api-access-c25r6\") pod \"02138685-46e1-4e52-bec8-78b88fb4f85b\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.850802 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02138685-46e1-4e52-bec8-78b88fb4f85b-serving-cert\") pod \"02138685-46e1-4e52-bec8-78b88fb4f85b\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 
02:48:04.850854 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9846729e-f0fa-4d45-8951-209bb228c5ed-serving-cert\") pod \"9846729e-f0fa-4d45-8951-209bb228c5ed\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.852262 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-client-ca\") pod \"02138685-46e1-4e52-bec8-78b88fb4f85b\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.852316 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-proxy-ca-bundles\") pod \"9846729e-f0fa-4d45-8951-209bb228c5ed\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.852363 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmggl\" (UniqueName: \"kubernetes.io/projected/9846729e-f0fa-4d45-8951-209bb228c5ed-kube-api-access-zmggl\") pod \"9846729e-f0fa-4d45-8951-209bb228c5ed\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.852398 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-config" (OuterVolumeSpecName: "config") pod "9846729e-f0fa-4d45-8951-209bb228c5ed" (UID: "9846729e-f0fa-4d45-8951-209bb228c5ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.852433 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-client-ca\") pod \"9846729e-f0fa-4d45-8951-209bb228c5ed\" (UID: \"9846729e-f0fa-4d45-8951-209bb228c5ed\") " Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.852545 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-config\") pod \"02138685-46e1-4e52-bec8-78b88fb4f85b\" (UID: \"02138685-46e1-4e52-bec8-78b88fb4f85b\") " Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.853077 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9846729e-f0fa-4d45-8951-209bb228c5ed" (UID: "9846729e-f0fa-4d45-8951-209bb228c5ed"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.853138 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-client-ca" (OuterVolumeSpecName: "client-ca") pod "9846729e-f0fa-4d45-8951-209bb228c5ed" (UID: "9846729e-f0fa-4d45-8951-209bb228c5ed"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.853291 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-client-ca" (OuterVolumeSpecName: "client-ca") pod "02138685-46e1-4e52-bec8-78b88fb4f85b" (UID: "02138685-46e1-4e52-bec8-78b88fb4f85b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.853754 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.853806 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.853838 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.853863 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9846729e-f0fa-4d45-8951-209bb228c5ed-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.854504 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-config" (OuterVolumeSpecName: "config") pod "02138685-46e1-4e52-bec8-78b88fb4f85b" (UID: "02138685-46e1-4e52-bec8-78b88fb4f85b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.855934 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02138685-46e1-4e52-bec8-78b88fb4f85b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "02138685-46e1-4e52-bec8-78b88fb4f85b" (UID: "02138685-46e1-4e52-bec8-78b88fb4f85b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.856659 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9846729e-f0fa-4d45-8951-209bb228c5ed-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9846729e-f0fa-4d45-8951-209bb228c5ed" (UID: "9846729e-f0fa-4d45-8951-209bb228c5ed"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.859332 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f216f4-98d4-44fe-b4f3-e6908f28ed4e-kube-api-access-vrbrc" (OuterVolumeSpecName: "kube-api-access-vrbrc") pod "75f216f4-98d4-44fe-b4f3-e6908f28ed4e" (UID: "75f216f4-98d4-44fe-b4f3-e6908f28ed4e"). InnerVolumeSpecName "kube-api-access-vrbrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.859445 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02138685-46e1-4e52-bec8-78b88fb4f85b-kube-api-access-c25r6" (OuterVolumeSpecName: "kube-api-access-c25r6") pod "02138685-46e1-4e52-bec8-78b88fb4f85b" (UID: "02138685-46e1-4e52-bec8-78b88fb4f85b"). InnerVolumeSpecName "kube-api-access-c25r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.859699 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9846729e-f0fa-4d45-8951-209bb228c5ed-kube-api-access-zmggl" (OuterVolumeSpecName: "kube-api-access-zmggl") pod "9846729e-f0fa-4d45-8951-209bb228c5ed" (UID: "9846729e-f0fa-4d45-8951-209bb228c5ed"). InnerVolumeSpecName "kube-api-access-zmggl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.955579 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02138685-46e1-4e52-bec8-78b88fb4f85b-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.956218 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrbrc\" (UniqueName: \"kubernetes.io/projected/75f216f4-98d4-44fe-b4f3-e6908f28ed4e-kube-api-access-vrbrc\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.956362 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c25r6\" (UniqueName: \"kubernetes.io/projected/02138685-46e1-4e52-bec8-78b88fb4f85b-kube-api-access-c25r6\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.956394 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02138685-46e1-4e52-bec8-78b88fb4f85b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.956456 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9846729e-f0fa-4d45-8951-209bb228c5ed-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:04 crc kubenswrapper[4901]: I0309 02:48:04.956481 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmggl\" (UniqueName: \"kubernetes.io/projected/9846729e-f0fa-4d45-8951-209bb228c5ed-kube-api-access-zmggl\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.434255 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550408-m756h" event={"ID":"75f216f4-98d4-44fe-b4f3-e6908f28ed4e","Type":"ContainerDied","Data":"b584439c4e00d5fdef80ff0e27a40f38d54f6629fa425606c86440b4005f908d"} 
Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.434356 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b584439c4e00d5fdef80ff0e27a40f38d54f6629fa425606c86440b4005f908d" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.434351 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550408-m756h" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.440103 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" event={"ID":"9846729e-f0fa-4d45-8951-209bb228c5ed","Type":"ContainerDied","Data":"7947f5617c880c327d0b6d4620f5cdb2622f1708688018b674da7e75c5d091a2"} Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.440194 4901 scope.go:117] "RemoveContainer" containerID="abbb7124e322c21585d012fc5d95e8635e364c245a2c80b7dff81af242f5dbfa" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.440430 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4f8899d-hhvgw" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.442763 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" event={"ID":"02138685-46e1-4e52-bec8-78b88fb4f85b","Type":"ContainerDied","Data":"1aa4e4397af2ed642527fb3f0d12c305b0dc71efe7135dc9ba1c5ca698940b5b"} Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.442822 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.482325 4901 scope.go:117] "RemoveContainer" containerID="d843013591b2789b9170b08145a2f6cb3062910f77b74dbfafd16f5ee42ffc45" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.482604 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt"] Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.489020 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-j6jxt"] Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.511159 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-hhvgw"] Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.518368 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-hhvgw"] Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.683852 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d845599ff-gshgh"] Mar 09 02:48:05 crc kubenswrapper[4901]: E0309 02:48:05.684212 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9846729e-f0fa-4d45-8951-209bb228c5ed" containerName="controller-manager" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.684269 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9846729e-f0fa-4d45-8951-209bb228c5ed" containerName="controller-manager" Mar 09 02:48:05 crc kubenswrapper[4901]: E0309 02:48:05.684317 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02138685-46e1-4e52-bec8-78b88fb4f85b" containerName="route-controller-manager" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.684333 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="02138685-46e1-4e52-bec8-78b88fb4f85b" containerName="route-controller-manager" Mar 09 02:48:05 crc kubenswrapper[4901]: E0309 02:48:05.684355 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f216f4-98d4-44fe-b4f3-e6908f28ed4e" containerName="oc" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.684371 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f216f4-98d4-44fe-b4f3-e6908f28ed4e" containerName="oc" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.684534 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="02138685-46e1-4e52-bec8-78b88fb4f85b" containerName="route-controller-manager" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.684555 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9846729e-f0fa-4d45-8951-209bb228c5ed" containerName="controller-manager" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.684578 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f216f4-98d4-44fe-b4f3-e6908f28ed4e" containerName="oc" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.685361 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.688607 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.689173 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.689654 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.690388 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.692675 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9"] Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.694066 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.694642 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.704529 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.704660 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.705312 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.705766 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.708194 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.708300 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.708392 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.716423 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.726910 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-6d845599ff-gshgh"] Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.753301 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9"] Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.869376 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67193b91-57e2-4def-a4b7-22af2987ad90-config\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.869441 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67193b91-57e2-4def-a4b7-22af2987ad90-proxy-ca-bundles\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.869469 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b16c08df-e2d1-48d8-8995-53c4707ef831-client-ca\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.869511 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16c08df-e2d1-48d8-8995-53c4707ef831-config\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.869568 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b16c08df-e2d1-48d8-8995-53c4707ef831-serving-cert\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.869593 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67193b91-57e2-4def-a4b7-22af2987ad90-serving-cert\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.869726 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67193b91-57e2-4def-a4b7-22af2987ad90-client-ca\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.869791 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2xsn\" (UniqueName: \"kubernetes.io/projected/b16c08df-e2d1-48d8-8995-53c4707ef831-kube-api-access-h2xsn\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.869886 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k879g\" (UniqueName: \"kubernetes.io/projected/67193b91-57e2-4def-a4b7-22af2987ad90-kube-api-access-k879g\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.971167 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b16c08df-e2d1-48d8-8995-53c4707ef831-serving-cert\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.971686 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67193b91-57e2-4def-a4b7-22af2987ad90-serving-cert\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.971734 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67193b91-57e2-4def-a4b7-22af2987ad90-client-ca\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.971760 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2xsn\" (UniqueName: \"kubernetes.io/projected/b16c08df-e2d1-48d8-8995-53c4707ef831-kube-api-access-h2xsn\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.971826 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k879g\" (UniqueName: \"kubernetes.io/projected/67193b91-57e2-4def-a4b7-22af2987ad90-kube-api-access-k879g\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.973149 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67193b91-57e2-4def-a4b7-22af2987ad90-config\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.973174 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67193b91-57e2-4def-a4b7-22af2987ad90-proxy-ca-bundles\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.974652 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b16c08df-e2d1-48d8-8995-53c4707ef831-client-ca\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.974204 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/67193b91-57e2-4def-a4b7-22af2987ad90-client-ca\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.974602 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67193b91-57e2-4def-a4b7-22af2987ad90-config\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.974365 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67193b91-57e2-4def-a4b7-22af2987ad90-proxy-ca-bundles\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.974853 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16c08df-e2d1-48d8-8995-53c4707ef831-config\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.975412 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b16c08df-e2d1-48d8-8995-53c4707ef831-client-ca\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.975703 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16c08df-e2d1-48d8-8995-53c4707ef831-config\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.976486 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b16c08df-e2d1-48d8-8995-53c4707ef831-serving-cert\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.977135 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67193b91-57e2-4def-a4b7-22af2987ad90-serving-cert\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:05 crc kubenswrapper[4901]: I0309 02:48:05.998255 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2xsn\" (UniqueName: \"kubernetes.io/projected/b16c08df-e2d1-48d8-8995-53c4707ef831-kube-api-access-h2xsn\") pod \"route-controller-manager-7f5fcdfd45-rnqd9\" (UID: \"b16c08df-e2d1-48d8-8995-53c4707ef831\") " pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:06 crc kubenswrapper[4901]: I0309 02:48:06.003132 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k879g\" (UniqueName: \"kubernetes.io/projected/67193b91-57e2-4def-a4b7-22af2987ad90-kube-api-access-k879g\") pod \"controller-manager-6d845599ff-gshgh\" (UID: \"67193b91-57e2-4def-a4b7-22af2987ad90\") " 
pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:06 crc kubenswrapper[4901]: I0309 02:48:06.024326 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:06 crc kubenswrapper[4901]: I0309 02:48:06.047819 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:06 crc kubenswrapper[4901]: I0309 02:48:06.119764 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02138685-46e1-4e52-bec8-78b88fb4f85b" path="/var/lib/kubelet/pods/02138685-46e1-4e52-bec8-78b88fb4f85b/volumes" Mar 09 02:48:06 crc kubenswrapper[4901]: I0309 02:48:06.132860 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9846729e-f0fa-4d45-8951-209bb228c5ed" path="/var/lib/kubelet/pods/9846729e-f0fa-4d45-8951-209bb228c5ed/volumes" Mar 09 02:48:06 crc kubenswrapper[4901]: I0309 02:48:06.342280 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9"] Mar 09 02:48:06 crc kubenswrapper[4901]: I0309 02:48:06.451160 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" event={"ID":"b16c08df-e2d1-48d8-8995-53c4707ef831","Type":"ContainerStarted","Data":"e6241392f1e4a6239683416a7242a21567b1cfd740619fbbf6d92fa013fe978c"} Mar 09 02:48:06 crc kubenswrapper[4901]: W0309 02:48:06.492739 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67193b91_57e2_4def_a4b7_22af2987ad90.slice/crio-2c98c7359bde4390686dab212b247ca43d37dc8b4345365644fee10aff6657cb WatchSource:0}: Error finding container 2c98c7359bde4390686dab212b247ca43d37dc8b4345365644fee10aff6657cb: Status 404 returned error 
can't find the container with id 2c98c7359bde4390686dab212b247ca43d37dc8b4345365644fee10aff6657cb Mar 09 02:48:06 crc kubenswrapper[4901]: I0309 02:48:06.492947 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d845599ff-gshgh"] Mar 09 02:48:07 crc kubenswrapper[4901]: I0309 02:48:07.465896 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" event={"ID":"67193b91-57e2-4def-a4b7-22af2987ad90","Type":"ContainerStarted","Data":"9bfbe705aa03e684c9ada02862a03cf620f86a742e4b06bb8d47df3c4ceec081"} Mar 09 02:48:07 crc kubenswrapper[4901]: I0309 02:48:07.466250 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" event={"ID":"67193b91-57e2-4def-a4b7-22af2987ad90","Type":"ContainerStarted","Data":"2c98c7359bde4390686dab212b247ca43d37dc8b4345365644fee10aff6657cb"} Mar 09 02:48:07 crc kubenswrapper[4901]: I0309 02:48:07.466588 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:07 crc kubenswrapper[4901]: I0309 02:48:07.467791 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" event={"ID":"b16c08df-e2d1-48d8-8995-53c4707ef831","Type":"ContainerStarted","Data":"87fdfbf3dae966ca9f750a44824755e3b380423b4afe16e481b68f6c63216c13"} Mar 09 02:48:07 crc kubenswrapper[4901]: I0309 02:48:07.470240 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" Mar 09 02:48:07 crc kubenswrapper[4901]: I0309 02:48:07.488586 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d845599ff-gshgh" podStartSLOduration=3.488570769 
podStartE2EDuration="3.488570769s" podCreationTimestamp="2026-03-09 02:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:48:07.484978007 +0000 UTC m=+412.074641739" watchObservedRunningTime="2026-03-09 02:48:07.488570769 +0000 UTC m=+412.078234501" Mar 09 02:48:07 crc kubenswrapper[4901]: I0309 02:48:07.506380 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" podStartSLOduration=3.5063663160000003 podStartE2EDuration="3.506366316s" podCreationTimestamp="2026-03-09 02:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:48:07.506062388 +0000 UTC m=+412.095726130" watchObservedRunningTime="2026-03-09 02:48:07.506366316 +0000 UTC m=+412.096030048" Mar 09 02:48:08 crc kubenswrapper[4901]: I0309 02:48:08.471598 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:08 crc kubenswrapper[4901]: I0309 02:48:08.477316 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f5fcdfd45-rnqd9" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.282639 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bcbkv"] Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.283892 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.308883 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bcbkv"] Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.447742 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rb92\" (UniqueName: \"kubernetes.io/projected/4d604daa-7a02-4d2a-ae84-a6d742481a04-kube-api-access-9rb92\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.448009 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d604daa-7a02-4d2a-ae84-a6d742481a04-bound-sa-token\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.448035 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d604daa-7a02-4d2a-ae84-a6d742481a04-registry-tls\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.448068 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d604daa-7a02-4d2a-ae84-a6d742481a04-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.448087 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d604daa-7a02-4d2a-ae84-a6d742481a04-trusted-ca\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.448127 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.448156 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d604daa-7a02-4d2a-ae84-a6d742481a04-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.448175 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d604daa-7a02-4d2a-ae84-a6d742481a04-registry-certificates\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.521476 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.548957 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d604daa-7a02-4d2a-ae84-a6d742481a04-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.548997 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d604daa-7a02-4d2a-ae84-a6d742481a04-registry-certificates\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.549018 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rb92\" (UniqueName: \"kubernetes.io/projected/4d604daa-7a02-4d2a-ae84-a6d742481a04-kube-api-access-9rb92\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.549042 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d604daa-7a02-4d2a-ae84-a6d742481a04-bound-sa-token\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 
02:48:26.549066 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d604daa-7a02-4d2a-ae84-a6d742481a04-registry-tls\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.549092 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d604daa-7a02-4d2a-ae84-a6d742481a04-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.549109 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d604daa-7a02-4d2a-ae84-a6d742481a04-trusted-ca\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.549483 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d604daa-7a02-4d2a-ae84-a6d742481a04-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.550106 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d604daa-7a02-4d2a-ae84-a6d742481a04-trusted-ca\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 
09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.550787 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d604daa-7a02-4d2a-ae84-a6d742481a04-registry-certificates\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.555335 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d604daa-7a02-4d2a-ae84-a6d742481a04-registry-tls\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.555535 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d604daa-7a02-4d2a-ae84-a6d742481a04-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.567216 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rb92\" (UniqueName: \"kubernetes.io/projected/4d604daa-7a02-4d2a-ae84-a6d742481a04-kube-api-access-9rb92\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.568708 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d604daa-7a02-4d2a-ae84-a6d742481a04-bound-sa-token\") pod \"image-registry-66df7c8f76-bcbkv\" (UID: \"4d604daa-7a02-4d2a-ae84-a6d742481a04\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.606573 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.951534 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7kz2"] Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.952557 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f7kz2" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerName="registry-server" containerID="cri-o://3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc" gracePeriod=30 Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.962743 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8c7zs"] Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.963594 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8c7zs" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerName="registry-server" containerID="cri-o://76f23be939376cddc439ea4d4d9116a4e5a11887f55232a53bfca8ed00f21d90" gracePeriod=30 Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.974504 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kwffm"] Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.974739 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" podUID="ae631b64-6f22-4112-8fb8-aa2c5140275b" containerName="marketplace-operator" containerID="cri-o://c0fe91b31aeae00aa38b08b18868079f749e2b65d9d799ba9b4efc927cb5ab96" gracePeriod=30 Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.997386 4901 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvtrc"] Mar 09 02:48:26 crc kubenswrapper[4901]: I0309 02:48:26.997793 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lvtrc" podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerName="registry-server" containerID="cri-o://39efb4f2ab60f7232608db324b7dfe0d78f50cfa4a13054ef9e97b8132ac8389" gracePeriod=30 Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.004008 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9pkg"] Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.004237 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m9pkg" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerName="registry-server" containerID="cri-o://eff38df816f5351f23333eb558e294ced397c67c3c364cc487c31e0d930ab1a7" gracePeriod=30 Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.006629 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xrwmx"] Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.007322 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.018042 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xrwmx"] Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.066194 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bcbkv"] Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.165930 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4383cf51-078c-4924-ac4d-746918c62fad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xrwmx\" (UID: \"4383cf51-078c-4924-ac4d-746918c62fad\") " pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.165991 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cbhb\" (UniqueName: \"kubernetes.io/projected/4383cf51-078c-4924-ac4d-746918c62fad-kube-api-access-6cbhb\") pod \"marketplace-operator-79b997595-xrwmx\" (UID: \"4383cf51-078c-4924-ac4d-746918c62fad\") " pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.166024 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4383cf51-078c-4924-ac4d-746918c62fad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xrwmx\" (UID: \"4383cf51-078c-4924-ac4d-746918c62fad\") " pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.267353 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cbhb\" 
(UniqueName: \"kubernetes.io/projected/4383cf51-078c-4924-ac4d-746918c62fad-kube-api-access-6cbhb\") pod \"marketplace-operator-79b997595-xrwmx\" (UID: \"4383cf51-078c-4924-ac4d-746918c62fad\") " pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.267415 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4383cf51-078c-4924-ac4d-746918c62fad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xrwmx\" (UID: \"4383cf51-078c-4924-ac4d-746918c62fad\") " pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.267573 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4383cf51-078c-4924-ac4d-746918c62fad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xrwmx\" (UID: \"4383cf51-078c-4924-ac4d-746918c62fad\") " pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.269521 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4383cf51-078c-4924-ac4d-746918c62fad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xrwmx\" (UID: \"4383cf51-078c-4924-ac4d-746918c62fad\") " pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.273465 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4383cf51-078c-4924-ac4d-746918c62fad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xrwmx\" (UID: \"4383cf51-078c-4924-ac4d-746918c62fad\") " pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 
09 02:48:27 crc kubenswrapper[4901]: I0309 02:48:27.288157 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cbhb\" (UniqueName: \"kubernetes.io/projected/4383cf51-078c-4924-ac4d-746918c62fad-kube-api-access-6cbhb\") pod \"marketplace-operator-79b997595-xrwmx\" (UID: \"4383cf51-078c-4924-ac4d-746918c62fad\") " pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.332773 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.513757 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.613440 4901 generic.go:334] "Generic (PLEG): container finished" podID="ae631b64-6f22-4112-8fb8-aa2c5140275b" containerID="c0fe91b31aeae00aa38b08b18868079f749e2b65d9d799ba9b4efc927cb5ab96" exitCode=0 Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.613520 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" event={"ID":"ae631b64-6f22-4112-8fb8-aa2c5140275b","Type":"ContainerDied","Data":"c0fe91b31aeae00aa38b08b18868079f749e2b65d9d799ba9b4efc927cb5ab96"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.619398 4901 generic.go:334] "Generic (PLEG): container finished" podID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerID="eff38df816f5351f23333eb558e294ced397c67c3c364cc487c31e0d930ab1a7" exitCode=0 Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.619453 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9pkg" event={"ID":"1f881266-e72e-4a74-a232-2dc3c6e95f08","Type":"ContainerDied","Data":"eff38df816f5351f23333eb558e294ced397c67c3c364cc487c31e0d930ab1a7"} Mar 09 02:48:28 
crc kubenswrapper[4901]: I0309 02:48:27.620691 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" event={"ID":"4d604daa-7a02-4d2a-ae84-a6d742481a04","Type":"ContainerStarted","Data":"0fa306b4e3f545476c3fdbb832b638f5cf0c2ed0b74e14b6f308cf6e876ea331"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.623110 4901 generic.go:334] "Generic (PLEG): container finished" podID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerID="3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc" exitCode=0 Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.623168 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7kz2" event={"ID":"04b4583f-8f26-47e0-8726-a0c2f2dca07e","Type":"ContainerDied","Data":"3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.623200 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7kz2" event={"ID":"04b4583f-8f26-47e0-8726-a0c2f2dca07e","Type":"ContainerDied","Data":"fa840e7517bc7d02b5da717dfbd5f74776795a5c63c17597b6707a0d89c11c3e"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.623236 4901 scope.go:117] "RemoveContainer" containerID="3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.623340 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7kz2" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.632724 4901 generic.go:334] "Generic (PLEG): container finished" podID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerID="76f23be939376cddc439ea4d4d9116a4e5a11887f55232a53bfca8ed00f21d90" exitCode=0 Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.632780 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7zs" event={"ID":"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20","Type":"ContainerDied","Data":"76f23be939376cddc439ea4d4d9116a4e5a11887f55232a53bfca8ed00f21d90"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.636557 4901 generic.go:334] "Generic (PLEG): container finished" podID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerID="39efb4f2ab60f7232608db324b7dfe0d78f50cfa4a13054ef9e97b8132ac8389" exitCode=0 Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.636602 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvtrc" event={"ID":"1092e265-a1ed-40f3-9a91-c1996ea7479c","Type":"ContainerDied","Data":"39efb4f2ab60f7232608db324b7dfe0d78f50cfa4a13054ef9e97b8132ac8389"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.636628 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvtrc" event={"ID":"1092e265-a1ed-40f3-9a91-c1996ea7479c","Type":"ContainerDied","Data":"4c82cd34d56ab02d1aa1166ad65a93b978d753395d57bf0a3096e373f891dc9a"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.636641 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c82cd34d56ab02d1aa1166ad65a93b978d753395d57bf0a3096e373f891dc9a" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.661348 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.667626 4901 scope.go:117] "RemoveContainer" containerID="41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.675263 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-utilities\") pod \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.676045 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-utilities" (OuterVolumeSpecName: "utilities") pod "04b4583f-8f26-47e0-8726-a0c2f2dca07e" (UID: "04b4583f-8f26-47e0-8726-a0c2f2dca07e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.676200 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mw89\" (UniqueName: \"kubernetes.io/projected/04b4583f-8f26-47e0-8726-a0c2f2dca07e-kube-api-access-6mw89\") pod \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.676271 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-catalog-content\") pod \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\" (UID: \"04b4583f-8f26-47e0-8726-a0c2f2dca07e\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.677844 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-utilities\") on node \"crc\" 
DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.690631 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.709358 4901 scope.go:117] "RemoveContainer" containerID="2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.712826 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.712946 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b4583f-8f26-47e0-8726-a0c2f2dca07e-kube-api-access-6mw89" (OuterVolumeSpecName: "kube-api-access-6mw89") pod "04b4583f-8f26-47e0-8726-a0c2f2dca07e" (UID: "04b4583f-8f26-47e0-8726-a0c2f2dca07e"). InnerVolumeSpecName "kube-api-access-6mw89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.722196 4901 scope.go:117] "RemoveContainer" containerID="3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc" Mar 09 02:48:28 crc kubenswrapper[4901]: E0309 02:48:27.726296 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc\": container with ID starting with 3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc not found: ID does not exist" containerID="3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.726342 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc"} err="failed to get container status \"3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc\": rpc error: code = NotFound desc = could not find container \"3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc\": container with ID starting with 3845026fe186c660c9b108474a279825364f66a95539fe61f869eff4ba913cbc not found: ID does not exist" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.726374 4901 scope.go:117] "RemoveContainer" containerID="41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211" Mar 09 02:48:28 crc kubenswrapper[4901]: E0309 02:48:27.726810 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211\": container with ID starting with 41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211 not found: ID does not exist" containerID="41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.726834 
4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211"} err="failed to get container status \"41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211\": rpc error: code = NotFound desc = could not find container \"41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211\": container with ID starting with 41b1ba29b78fb5b576a7bcccaae5ff9b4841b961b09d6dd144dffde6de0b4211 not found: ID does not exist" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.726852 4901 scope.go:117] "RemoveContainer" containerID="2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.732843 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:48:28 crc kubenswrapper[4901]: E0309 02:48:27.733171 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66\": container with ID starting with 2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66 not found: ID does not exist" containerID="2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.733197 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66"} err="failed to get container status \"2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66\": rpc error: code = NotFound desc = could not find container \"2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66\": container with ID starting with 2d265cfec6c4e06c96aa8bff7ddac181540e019a7827efc35c9a92f92e5afb66 not found: ID does not exist" Mar 09 02:48:28 crc 
kubenswrapper[4901]: I0309 02:48:27.770557 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04b4583f-8f26-47e0-8726-a0c2f2dca07e" (UID: "04b4583f-8f26-47e0-8726-a0c2f2dca07e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.778280 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-utilities\") pod \"1092e265-a1ed-40f3-9a91-c1996ea7479c\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.778329 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q425q\" (UniqueName: \"kubernetes.io/projected/1092e265-a1ed-40f3-9a91-c1996ea7479c-kube-api-access-q425q\") pod \"1092e265-a1ed-40f3-9a91-c1996ea7479c\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.778392 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-operator-metrics\") pod \"ae631b64-6f22-4112-8fb8-aa2c5140275b\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.778412 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-catalog-content\") pod \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.778432 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-zwnxp\" (UniqueName: \"kubernetes.io/projected/ae631b64-6f22-4112-8fb8-aa2c5140275b-kube-api-access-zwnxp\") pod \"ae631b64-6f22-4112-8fb8-aa2c5140275b\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.779319 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-utilities" (OuterVolumeSpecName: "utilities") pod "1092e265-a1ed-40f3-9a91-c1996ea7479c" (UID: "1092e265-a1ed-40f3-9a91-c1996ea7479c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.779875 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-catalog-content\") pod \"1092e265-a1ed-40f3-9a91-c1996ea7479c\" (UID: \"1092e265-a1ed-40f3-9a91-c1996ea7479c\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.779917 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-trusted-ca\") pod \"ae631b64-6f22-4112-8fb8-aa2c5140275b\" (UID: \"ae631b64-6f22-4112-8fb8-aa2c5140275b\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.779974 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzcbv\" (UniqueName: \"kubernetes.io/projected/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-kube-api-access-fzcbv\") pod \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.780006 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-utilities\") pod \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\" (UID: \"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.780207 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.780237 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mw89\" (UniqueName: \"kubernetes.io/projected/04b4583f-8f26-47e0-8726-a0c2f2dca07e-kube-api-access-6mw89\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.780249 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b4583f-8f26-47e0-8726-a0c2f2dca07e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.780780 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ae631b64-6f22-4112-8fb8-aa2c5140275b" (UID: "ae631b64-6f22-4112-8fb8-aa2c5140275b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.780934 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-utilities" (OuterVolumeSpecName: "utilities") pod "b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" (UID: "b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.781094 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae631b64-6f22-4112-8fb8-aa2c5140275b-kube-api-access-zwnxp" (OuterVolumeSpecName: "kube-api-access-zwnxp") pod "ae631b64-6f22-4112-8fb8-aa2c5140275b" (UID: "ae631b64-6f22-4112-8fb8-aa2c5140275b"). InnerVolumeSpecName "kube-api-access-zwnxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.781127 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1092e265-a1ed-40f3-9a91-c1996ea7479c-kube-api-access-q425q" (OuterVolumeSpecName: "kube-api-access-q425q") pod "1092e265-a1ed-40f3-9a91-c1996ea7479c" (UID: "1092e265-a1ed-40f3-9a91-c1996ea7479c"). InnerVolumeSpecName "kube-api-access-q425q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.781194 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ae631b64-6f22-4112-8fb8-aa2c5140275b" (UID: "ae631b64-6f22-4112-8fb8-aa2c5140275b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.783339 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-kube-api-access-fzcbv" (OuterVolumeSpecName: "kube-api-access-fzcbv") pod "b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" (UID: "b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20"). InnerVolumeSpecName "kube-api-access-fzcbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.822548 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1092e265-a1ed-40f3-9a91-c1996ea7479c" (UID: "1092e265-a1ed-40f3-9a91-c1996ea7479c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.824716 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" (UID: "b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.880763 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-utilities\") pod \"1f881266-e72e-4a74-a232-2dc3c6e95f08\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.880808 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvb89\" (UniqueName: \"kubernetes.io/projected/1f881266-e72e-4a74-a232-2dc3c6e95f08-kube-api-access-hvb89\") pod \"1f881266-e72e-4a74-a232-2dc3c6e95f08\" (UID: \"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.880859 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-catalog-content\") pod \"1f881266-e72e-4a74-a232-2dc3c6e95f08\" (UID: 
\"1f881266-e72e-4a74-a232-2dc3c6e95f08\") " Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.881080 4901 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.881092 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.881103 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwnxp\" (UniqueName: \"kubernetes.io/projected/ae631b64-6f22-4112-8fb8-aa2c5140275b-kube-api-access-zwnxp\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.881112 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1092e265-a1ed-40f3-9a91-c1996ea7479c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.881120 4901 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae631b64-6f22-4112-8fb8-aa2c5140275b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.881128 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzcbv\" (UniqueName: \"kubernetes.io/projected/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-kube-api-access-fzcbv\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.881137 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 
crc kubenswrapper[4901]: I0309 02:48:27.881146 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q425q\" (UniqueName: \"kubernetes.io/projected/1092e265-a1ed-40f3-9a91-c1996ea7479c-kube-api-access-q425q\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.882373 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-utilities" (OuterVolumeSpecName: "utilities") pod "1f881266-e72e-4a74-a232-2dc3c6e95f08" (UID: "1f881266-e72e-4a74-a232-2dc3c6e95f08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.883354 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f881266-e72e-4a74-a232-2dc3c6e95f08-kube-api-access-hvb89" (OuterVolumeSpecName: "kube-api-access-hvb89") pod "1f881266-e72e-4a74-a232-2dc3c6e95f08" (UID: "1f881266-e72e-4a74-a232-2dc3c6e95f08"). InnerVolumeSpecName "kube-api-access-hvb89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.962407 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7kz2"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.968809 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f7kz2"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.984342 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:27.984365 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvb89\" (UniqueName: \"kubernetes.io/projected/1f881266-e72e-4a74-a232-2dc3c6e95f08-kube-api-access-hvb89\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.060760 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f881266-e72e-4a74-a232-2dc3c6e95f08" (UID: "1f881266-e72e-4a74-a232-2dc3c6e95f08"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.085408 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f881266-e72e-4a74-a232-2dc3c6e95f08-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.112354 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" path="/var/lib/kubelet/pods/04b4583f-8f26-47e0-8726-a0c2f2dca07e/volumes" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.309043 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xrwmx"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.646780 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8c7zs" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.646773 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7zs" event={"ID":"b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20","Type":"ContainerDied","Data":"ad77c7e73f6190c64b8f69b2d69a0e6be726b6532002de8c126c56ec67e77890"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.646842 4901 scope.go:117] "RemoveContainer" containerID="76f23be939376cddc439ea4d4d9116a4e5a11887f55232a53bfca8ed00f21d90" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.648349 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" event={"ID":"4383cf51-078c-4924-ac4d-746918c62fad","Type":"ContainerStarted","Data":"ca74ad0bc7a4426be92794e0898ce4a955f2270e2cf127fb4925ca7bc0100897"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.648375 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" 
event={"ID":"4383cf51-078c-4924-ac4d-746918c62fad","Type":"ContainerStarted","Data":"6fb1046fcdcbb5c0c919ca8034d17bbdb07a0392039f33e787782207a4b60bc0"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.649387 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.649766 4901 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xrwmx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.77:8080/healthz\": dial tcp 10.217.0.77:8080: connect: connection refused" start-of-body= Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.649816 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" podUID="4383cf51-078c-4924-ac4d-746918c62fad" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.77:8080/healthz\": dial tcp 10.217.0.77:8080: connect: connection refused" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.651156 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.651163 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kwffm" event={"ID":"ae631b64-6f22-4112-8fb8-aa2c5140275b","Type":"ContainerDied","Data":"db246342e47fd6acacbe8eabe928deb9ca559a03aad68d500c550d7c64a85815"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.654193 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9pkg" event={"ID":"1f881266-e72e-4a74-a232-2dc3c6e95f08","Type":"ContainerDied","Data":"26f0c78f7c31f9c59be5d85ae92944375c6d9934b1dc573c9efa445e871ddb5c"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.654455 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9pkg" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.658150 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvtrc" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.659922 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" event={"ID":"4d604daa-7a02-4d2a-ae84-a6d742481a04","Type":"ContainerStarted","Data":"5ff78c46ffebe53aa0a05e5178abced3a94178c7439aab688461d7eb5cfa6231"} Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.659987 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.670524 4901 scope.go:117] "RemoveContainer" containerID="ca2c6c23aed288577f4d7e1e61eda7f536c85a75458cbd1c4d8dcaf8737d7d1f" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.688360 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kwffm"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.696649 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kwffm"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.698200 4901 scope.go:117] "RemoveContainer" containerID="871f8665b64e519e97af09130385cb75718dc35dd882a5dbc5ec662ba988581e" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.710580 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" podStartSLOduration=2.710564077 podStartE2EDuration="2.710564077s" podCreationTimestamp="2026-03-09 02:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:48:28.705887126 +0000 UTC m=+433.295550878" watchObservedRunningTime="2026-03-09 02:48:28.710564077 +0000 UTC m=+433.300227819" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.726638 4901 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8c7zs"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.729326 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8c7zs"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.732970 4901 scope.go:117] "RemoveContainer" containerID="c0fe91b31aeae00aa38b08b18868079f749e2b65d9d799ba9b4efc927cb5ab96" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.743206 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9pkg"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.754926 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m9pkg"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.756456 4901 scope.go:117] "RemoveContainer" containerID="eff38df816f5351f23333eb558e294ced397c67c3c364cc487c31e0d930ab1a7" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.758044 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" podStartSLOduration=2.758025175 podStartE2EDuration="2.758025175s" podCreationTimestamp="2026-03-09 02:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:48:28.75276524 +0000 UTC m=+433.342428982" watchObservedRunningTime="2026-03-09 02:48:28.758025175 +0000 UTC m=+433.347688917" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.773609 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvtrc"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.773652 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvtrc"] Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.775508 4901 scope.go:117] 
"RemoveContainer" containerID="c4a8c0c135f0c22c3c3d701e20c21309460b6bc3206ee2cbe413b767f0b661fe" Mar 09 02:48:28 crc kubenswrapper[4901]: I0309 02:48:28.795204 4901 scope.go:117] "RemoveContainer" containerID="1dbb22790e59c59af731447241266c244f449a0deeb5bf08a313ec3f2c0c8430" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367290 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8l687"] Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367693 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerName="extract-utilities" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367705 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerName="extract-utilities" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367716 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerName="extract-utilities" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367722 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerName="extract-utilities" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367732 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367737 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367743 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerName="extract-content" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367749 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerName="extract-content" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367758 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerName="extract-content" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367765 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerName="extract-content" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367772 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367777 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367787 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerName="extract-utilities" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367793 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerName="extract-utilities" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367800 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerName="extract-content" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367806 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerName="extract-content" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367815 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerName="extract-utilities" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367821 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerName="extract-utilities" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367829 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367834 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367842 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerName="extract-content" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367849 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerName="extract-content" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367857 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae631b64-6f22-4112-8fb8-aa2c5140275b" containerName="marketplace-operator" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367862 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae631b64-6f22-4112-8fb8-aa2c5140275b" containerName="marketplace-operator" Mar 09 02:48:29 crc kubenswrapper[4901]: E0309 02:48:29.367869 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367875 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367951 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367961 4901 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="ae631b64-6f22-4112-8fb8-aa2c5140275b" containerName="marketplace-operator" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367969 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b4583f-8f26-47e0-8726-a0c2f2dca07e" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367979 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.367988 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" containerName="registry-server" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.369120 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.372260 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.437786 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8l687"] Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.507547 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29vrz\" (UniqueName: \"kubernetes.io/projected/4331116b-95d4-404d-9f0a-97919df59eb4-kube-api-access-29vrz\") pod \"redhat-operators-8l687\" (UID: \"4331116b-95d4-404d-9f0a-97919df59eb4\") " pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.507617 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4331116b-95d4-404d-9f0a-97919df59eb4-catalog-content\") pod \"redhat-operators-8l687\" (UID: 
\"4331116b-95d4-404d-9f0a-97919df59eb4\") " pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.507652 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4331116b-95d4-404d-9f0a-97919df59eb4-utilities\") pod \"redhat-operators-8l687\" (UID: \"4331116b-95d4-404d-9f0a-97919df59eb4\") " pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.608828 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4331116b-95d4-404d-9f0a-97919df59eb4-utilities\") pod \"redhat-operators-8l687\" (UID: \"4331116b-95d4-404d-9f0a-97919df59eb4\") " pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.608930 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29vrz\" (UniqueName: \"kubernetes.io/projected/4331116b-95d4-404d-9f0a-97919df59eb4-kube-api-access-29vrz\") pod \"redhat-operators-8l687\" (UID: \"4331116b-95d4-404d-9f0a-97919df59eb4\") " pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.609058 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4331116b-95d4-404d-9f0a-97919df59eb4-catalog-content\") pod \"redhat-operators-8l687\" (UID: \"4331116b-95d4-404d-9f0a-97919df59eb4\") " pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.609623 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4331116b-95d4-404d-9f0a-97919df59eb4-catalog-content\") pod \"redhat-operators-8l687\" (UID: \"4331116b-95d4-404d-9f0a-97919df59eb4\") " 
pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.609891 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4331116b-95d4-404d-9f0a-97919df59eb4-utilities\") pod \"redhat-operators-8l687\" (UID: \"4331116b-95d4-404d-9f0a-97919df59eb4\") " pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.637991 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29vrz\" (UniqueName: \"kubernetes.io/projected/4331116b-95d4-404d-9f0a-97919df59eb4-kube-api-access-29vrz\") pod \"redhat-operators-8l687\" (UID: \"4331116b-95d4-404d-9f0a-97919df59eb4\") " pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.672500 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xrwmx" Mar 09 02:48:29 crc kubenswrapper[4901]: I0309 02:48:29.695267 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:30 crc kubenswrapper[4901]: I0309 02:48:30.122939 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1092e265-a1ed-40f3-9a91-c1996ea7479c" path="/var/lib/kubelet/pods/1092e265-a1ed-40f3-9a91-c1996ea7479c/volumes" Mar 09 02:48:30 crc kubenswrapper[4901]: I0309 02:48:30.125597 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f881266-e72e-4a74-a232-2dc3c6e95f08" path="/var/lib/kubelet/pods/1f881266-e72e-4a74-a232-2dc3c6e95f08/volumes" Mar 09 02:48:30 crc kubenswrapper[4901]: I0309 02:48:30.128586 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae631b64-6f22-4112-8fb8-aa2c5140275b" path="/var/lib/kubelet/pods/ae631b64-6f22-4112-8fb8-aa2c5140275b/volumes" Mar 09 02:48:30 crc kubenswrapper[4901]: I0309 02:48:30.129749 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20" path="/var/lib/kubelet/pods/b8f8e088-32d5-4d89-87cc-5a3ecc3f3f20/volumes" Mar 09 02:48:30 crc kubenswrapper[4901]: I0309 02:48:30.130809 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8l687"] Mar 09 02:48:30 crc kubenswrapper[4901]: I0309 02:48:30.673942 4901 generic.go:334] "Generic (PLEG): container finished" podID="4331116b-95d4-404d-9f0a-97919df59eb4" containerID="6d9550535a1db09c461ad5296c2511dc546541ad686a9f1d66ff775350fc66fc" exitCode=0 Mar 09 02:48:30 crc kubenswrapper[4901]: I0309 02:48:30.674026 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l687" event={"ID":"4331116b-95d4-404d-9f0a-97919df59eb4","Type":"ContainerDied","Data":"6d9550535a1db09c461ad5296c2511dc546541ad686a9f1d66ff775350fc66fc"} Mar 09 02:48:30 crc kubenswrapper[4901]: I0309 02:48:30.674271 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l687" 
event={"ID":"4331116b-95d4-404d-9f0a-97919df59eb4","Type":"ContainerStarted","Data":"c8181be992345dd450632d169cf13c5865ea4b0acadc7d2060b5adc504cd71f4"} Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.176106 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xkv4b"] Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.177351 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.178992 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.182735 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xkv4b"] Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.234741 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6b5h\" (UniqueName: \"kubernetes.io/projected/6306095b-e7a3-4041-b513-2340505b5bef-kube-api-access-s6b5h\") pod \"certified-operators-xkv4b\" (UID: \"6306095b-e7a3-4041-b513-2340505b5bef\") " pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.235042 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6306095b-e7a3-4041-b513-2340505b5bef-utilities\") pod \"certified-operators-xkv4b\" (UID: \"6306095b-e7a3-4041-b513-2340505b5bef\") " pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.235089 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6306095b-e7a3-4041-b513-2340505b5bef-catalog-content\") pod 
\"certified-operators-xkv4b\" (UID: \"6306095b-e7a3-4041-b513-2340505b5bef\") " pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.336212 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6306095b-e7a3-4041-b513-2340505b5bef-utilities\") pod \"certified-operators-xkv4b\" (UID: \"6306095b-e7a3-4041-b513-2340505b5bef\") " pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.336352 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6306095b-e7a3-4041-b513-2340505b5bef-catalog-content\") pod \"certified-operators-xkv4b\" (UID: \"6306095b-e7a3-4041-b513-2340505b5bef\") " pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.336575 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6b5h\" (UniqueName: \"kubernetes.io/projected/6306095b-e7a3-4041-b513-2340505b5bef-kube-api-access-s6b5h\") pod \"certified-operators-xkv4b\" (UID: \"6306095b-e7a3-4041-b513-2340505b5bef\") " pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.336924 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6306095b-e7a3-4041-b513-2340505b5bef-utilities\") pod \"certified-operators-xkv4b\" (UID: \"6306095b-e7a3-4041-b513-2340505b5bef\") " pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.337012 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6306095b-e7a3-4041-b513-2340505b5bef-catalog-content\") pod \"certified-operators-xkv4b\" (UID: 
\"6306095b-e7a3-4041-b513-2340505b5bef\") " pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.364213 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6b5h\" (UniqueName: \"kubernetes.io/projected/6306095b-e7a3-4041-b513-2340505b5bef-kube-api-access-s6b5h\") pod \"certified-operators-xkv4b\" (UID: \"6306095b-e7a3-4041-b513-2340505b5bef\") " pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.535010 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.781097 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r8t8d"] Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.783095 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.788600 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.791610 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8t8d"] Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.843729 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4-catalog-content\") pod \"community-operators-r8t8d\" (UID: \"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4\") " pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.843870 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t22s7\" (UniqueName: \"kubernetes.io/projected/6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4-kube-api-access-t22s7\") pod \"community-operators-r8t8d\" (UID: \"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4\") " pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.844022 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4-utilities\") pod \"community-operators-r8t8d\" (UID: \"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4\") " pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.945673 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4-utilities\") pod \"community-operators-r8t8d\" (UID: \"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4\") " pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.945847 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4-catalog-content\") pod \"community-operators-r8t8d\" (UID: \"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4\") " pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.945961 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t22s7\" (UniqueName: \"kubernetes.io/projected/6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4-kube-api-access-t22s7\") pod \"community-operators-r8t8d\" (UID: \"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4\") " pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.947335 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4-catalog-content\") pod \"community-operators-r8t8d\" (UID: \"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4\") " pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.947996 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4-utilities\") pod \"community-operators-r8t8d\" (UID: \"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4\") " pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.985147 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22s7\" (UniqueName: \"kubernetes.io/projected/6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4-kube-api-access-t22s7\") pod \"community-operators-r8t8d\" (UID: \"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4\") " pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:31 crc kubenswrapper[4901]: I0309 02:48:31.991501 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xkv4b"] Mar 09 02:48:32 crc kubenswrapper[4901]: I0309 02:48:32.111693 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:32 crc kubenswrapper[4901]: I0309 02:48:32.563947 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8t8d"] Mar 09 02:48:32 crc kubenswrapper[4901]: I0309 02:48:32.688146 4901 generic.go:334] "Generic (PLEG): container finished" podID="6306095b-e7a3-4041-b513-2340505b5bef" containerID="47df0d350fae3e09992b9e0f0c0ff49a7b6b9107f6f55d210d6de3b4a89938e6" exitCode=0 Mar 09 02:48:32 crc kubenswrapper[4901]: I0309 02:48:32.688216 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xkv4b" event={"ID":"6306095b-e7a3-4041-b513-2340505b5bef","Type":"ContainerDied","Data":"47df0d350fae3e09992b9e0f0c0ff49a7b6b9107f6f55d210d6de3b4a89938e6"} Mar 09 02:48:32 crc kubenswrapper[4901]: I0309 02:48:32.688266 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xkv4b" event={"ID":"6306095b-e7a3-4041-b513-2340505b5bef","Type":"ContainerStarted","Data":"fb01cca57483090d096a466fd4a7a25c549211ca7fe3a977b39f49a70208616c"} Mar 09 02:48:32 crc kubenswrapper[4901]: I0309 02:48:32.696914 4901 generic.go:334] "Generic (PLEG): container finished" podID="4331116b-95d4-404d-9f0a-97919df59eb4" containerID="c99b4d5f6e53033a32dad12cbd7949c450ca2c280e075c661af77382cd87b078" exitCode=0 Mar 09 02:48:32 crc kubenswrapper[4901]: I0309 02:48:32.697178 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l687" event={"ID":"4331116b-95d4-404d-9f0a-97919df59eb4","Type":"ContainerDied","Data":"c99b4d5f6e53033a32dad12cbd7949c450ca2c280e075c661af77382cd87b078"} Mar 09 02:48:32 crc kubenswrapper[4901]: I0309 02:48:32.702431 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8t8d" 
event={"ID":"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4","Type":"ContainerStarted","Data":"975bcf40a9e3e24669e242cc9e95d3f0e5dd3bfca5b3766a59c642cb5cf4c9c2"} Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.565547 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dsnfd"] Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.566524 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.569557 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.591954 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsnfd"] Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.668900 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6311cf-10e6-4c9f-a90c-9ed2a95680d8-utilities\") pod \"redhat-marketplace-dsnfd\" (UID: \"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8\") " pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.668975 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6311cf-10e6-4c9f-a90c-9ed2a95680d8-catalog-content\") pod \"redhat-marketplace-dsnfd\" (UID: \"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8\") " pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.669067 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjwdp\" (UniqueName: \"kubernetes.io/projected/cd6311cf-10e6-4c9f-a90c-9ed2a95680d8-kube-api-access-qjwdp\") pod 
\"redhat-marketplace-dsnfd\" (UID: \"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8\") " pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.710167 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xkv4b" event={"ID":"6306095b-e7a3-4041-b513-2340505b5bef","Type":"ContainerStarted","Data":"23726208052b2253d90a0e191ea4d3395ca489c9309955c74ead9da5c1f57b94"} Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.714617 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8l687" event={"ID":"4331116b-95d4-404d-9f0a-97919df59eb4","Type":"ContainerStarted","Data":"8ad1f567382d718ea0a28b86104d22c987dabdc4e2a6780916f5ccd00d1bff86"} Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.718038 4901 generic.go:334] "Generic (PLEG): container finished" podID="6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4" containerID="1ce34f9d7e62a4521311bfb13974bd1200ea58985bdd638e84106eff8e8fde21" exitCode=0 Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.718092 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8t8d" event={"ID":"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4","Type":"ContainerDied","Data":"1ce34f9d7e62a4521311bfb13974bd1200ea58985bdd638e84106eff8e8fde21"} Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.768316 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8l687" podStartSLOduration=2.329805932 podStartE2EDuration="4.768294848s" podCreationTimestamp="2026-03-09 02:48:29 +0000 UTC" firstStartedPulling="2026-03-09 02:48:30.675721047 +0000 UTC m=+435.265384769" lastFinishedPulling="2026-03-09 02:48:33.114209953 +0000 UTC m=+437.703873685" observedRunningTime="2026-03-09 02:48:33.75317495 +0000 UTC m=+438.342838712" watchObservedRunningTime="2026-03-09 02:48:33.768294848 +0000 UTC m=+438.357958570" Mar 09 02:48:33 
crc kubenswrapper[4901]: I0309 02:48:33.770567 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjwdp\" (UniqueName: \"kubernetes.io/projected/cd6311cf-10e6-4c9f-a90c-9ed2a95680d8-kube-api-access-qjwdp\") pod \"redhat-marketplace-dsnfd\" (UID: \"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8\") " pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.770626 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6311cf-10e6-4c9f-a90c-9ed2a95680d8-utilities\") pod \"redhat-marketplace-dsnfd\" (UID: \"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8\") " pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.770681 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6311cf-10e6-4c9f-a90c-9ed2a95680d8-catalog-content\") pod \"redhat-marketplace-dsnfd\" (UID: \"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8\") " pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.771426 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6311cf-10e6-4c9f-a90c-9ed2a95680d8-catalog-content\") pod \"redhat-marketplace-dsnfd\" (UID: \"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8\") " pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.771569 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6311cf-10e6-4c9f-a90c-9ed2a95680d8-utilities\") pod \"redhat-marketplace-dsnfd\" (UID: \"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8\") " pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.797690 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjwdp\" (UniqueName: \"kubernetes.io/projected/cd6311cf-10e6-4c9f-a90c-9ed2a95680d8-kube-api-access-qjwdp\") pod \"redhat-marketplace-dsnfd\" (UID: \"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8\") " pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:33 crc kubenswrapper[4901]: I0309 02:48:33.883635 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:34 crc kubenswrapper[4901]: I0309 02:48:34.329067 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsnfd"] Mar 09 02:48:34 crc kubenswrapper[4901]: I0309 02:48:34.726853 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8t8d" event={"ID":"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4","Type":"ContainerStarted","Data":"df6d6eceadae0725bb9d0b5e9f6c3fd88b640dfe82fc2d62037a1a62f048ee1e"} Mar 09 02:48:34 crc kubenswrapper[4901]: I0309 02:48:34.730431 4901 generic.go:334] "Generic (PLEG): container finished" podID="6306095b-e7a3-4041-b513-2340505b5bef" containerID="23726208052b2253d90a0e191ea4d3395ca489c9309955c74ead9da5c1f57b94" exitCode=0 Mar 09 02:48:34 crc kubenswrapper[4901]: I0309 02:48:34.730545 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xkv4b" event={"ID":"6306095b-e7a3-4041-b513-2340505b5bef","Type":"ContainerDied","Data":"23726208052b2253d90a0e191ea4d3395ca489c9309955c74ead9da5c1f57b94"} Mar 09 02:48:34 crc kubenswrapper[4901]: I0309 02:48:34.733309 4901 generic.go:334] "Generic (PLEG): container finished" podID="cd6311cf-10e6-4c9f-a90c-9ed2a95680d8" containerID="fd0ea4f67b34d9c3cceaaee561f849656bb49898958440c6debd3270244d3585" exitCode=0 Mar 09 02:48:34 crc kubenswrapper[4901]: I0309 02:48:34.733384 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-dsnfd" event={"ID":"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8","Type":"ContainerDied","Data":"fd0ea4f67b34d9c3cceaaee561f849656bb49898958440c6debd3270244d3585"} Mar 09 02:48:34 crc kubenswrapper[4901]: I0309 02:48:34.733420 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsnfd" event={"ID":"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8","Type":"ContainerStarted","Data":"b7c787418f6edacad243cd01ca48ca276778e99986c3792b2c132d9274aeae11"} Mar 09 02:48:35 crc kubenswrapper[4901]: I0309 02:48:35.742243 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsnfd" event={"ID":"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8","Type":"ContainerStarted","Data":"9e56d99eca027bb3ca77f0b3b3f20acb59d616ac2321db41a4943bcb32aebb60"} Mar 09 02:48:35 crc kubenswrapper[4901]: I0309 02:48:35.744249 4901 generic.go:334] "Generic (PLEG): container finished" podID="6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4" containerID="df6d6eceadae0725bb9d0b5e9f6c3fd88b640dfe82fc2d62037a1a62f048ee1e" exitCode=0 Mar 09 02:48:35 crc kubenswrapper[4901]: I0309 02:48:35.744316 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8t8d" event={"ID":"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4","Type":"ContainerDied","Data":"df6d6eceadae0725bb9d0b5e9f6c3fd88b640dfe82fc2d62037a1a62f048ee1e"} Mar 09 02:48:35 crc kubenswrapper[4901]: I0309 02:48:35.750075 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xkv4b" event={"ID":"6306095b-e7a3-4041-b513-2340505b5bef","Type":"ContainerStarted","Data":"85d1bdb20e96fbabfc4871a06aafcc472732e303d074331a1c8050a161896776"} Mar 09 02:48:35 crc kubenswrapper[4901]: I0309 02:48:35.820638 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xkv4b" podStartSLOduration=2.358500977 
podStartE2EDuration="4.820619938s" podCreationTimestamp="2026-03-09 02:48:31 +0000 UTC" firstStartedPulling="2026-03-09 02:48:32.690518484 +0000 UTC m=+437.280182216" lastFinishedPulling="2026-03-09 02:48:35.152637445 +0000 UTC m=+439.742301177" observedRunningTime="2026-03-09 02:48:35.813900295 +0000 UTC m=+440.403564057" watchObservedRunningTime="2026-03-09 02:48:35.820619938 +0000 UTC m=+440.410283690" Mar 09 02:48:36 crc kubenswrapper[4901]: I0309 02:48:36.759432 4901 generic.go:334] "Generic (PLEG): container finished" podID="cd6311cf-10e6-4c9f-a90c-9ed2a95680d8" containerID="9e56d99eca027bb3ca77f0b3b3f20acb59d616ac2321db41a4943bcb32aebb60" exitCode=0 Mar 09 02:48:36 crc kubenswrapper[4901]: I0309 02:48:36.759501 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsnfd" event={"ID":"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8","Type":"ContainerDied","Data":"9e56d99eca027bb3ca77f0b3b3f20acb59d616ac2321db41a4943bcb32aebb60"} Mar 09 02:48:36 crc kubenswrapper[4901]: I0309 02:48:36.762365 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8t8d" event={"ID":"6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4","Type":"ContainerStarted","Data":"5e682c6274362e4d78e5e5777c1183305e59899a2e44cb2d01d04e3bf94f68a2"} Mar 09 02:48:36 crc kubenswrapper[4901]: I0309 02:48:36.809624 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r8t8d" podStartSLOduration=3.358717659 podStartE2EDuration="5.809601023s" podCreationTimestamp="2026-03-09 02:48:31 +0000 UTC" firstStartedPulling="2026-03-09 02:48:33.721087746 +0000 UTC m=+438.310751718" lastFinishedPulling="2026-03-09 02:48:36.17197134 +0000 UTC m=+440.761635082" observedRunningTime="2026-03-09 02:48:36.808738531 +0000 UTC m=+441.398402263" watchObservedRunningTime="2026-03-09 02:48:36.809601023 +0000 UTC m=+441.399264765" Mar 09 02:48:37 crc kubenswrapper[4901]: I0309 
02:48:37.769330 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsnfd" event={"ID":"cd6311cf-10e6-4c9f-a90c-9ed2a95680d8","Type":"ContainerStarted","Data":"f21671e66a10ca1d0499e1ba2cbad839062e1127d3f05c53d0046734c97f15c1"} Mar 09 02:48:37 crc kubenswrapper[4901]: I0309 02:48:37.792496 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dsnfd" podStartSLOduration=2.371495854 podStartE2EDuration="4.792477521s" podCreationTimestamp="2026-03-09 02:48:33 +0000 UTC" firstStartedPulling="2026-03-09 02:48:34.736095099 +0000 UTC m=+439.325758841" lastFinishedPulling="2026-03-09 02:48:37.157076786 +0000 UTC m=+441.746740508" observedRunningTime="2026-03-09 02:48:37.788758245 +0000 UTC m=+442.378421987" watchObservedRunningTime="2026-03-09 02:48:37.792477521 +0000 UTC m=+442.382141253" Mar 09 02:48:39 crc kubenswrapper[4901]: I0309 02:48:39.696147 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:39 crc kubenswrapper[4901]: I0309 02:48:39.696189 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:40 crc kubenswrapper[4901]: I0309 02:48:40.740014 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8l687" podUID="4331116b-95d4-404d-9f0a-97919df59eb4" containerName="registry-server" probeResult="failure" output=< Mar 09 02:48:40 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 02:48:40 crc kubenswrapper[4901]: > Mar 09 02:48:41 crc kubenswrapper[4901]: I0309 02:48:41.544589 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:41 crc kubenswrapper[4901]: I0309 02:48:41.544681 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:41 crc kubenswrapper[4901]: I0309 02:48:41.612384 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:41 crc kubenswrapper[4901]: I0309 02:48:41.857131 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xkv4b" Mar 09 02:48:42 crc kubenswrapper[4901]: I0309 02:48:42.119403 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:42 crc kubenswrapper[4901]: I0309 02:48:42.119656 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:42 crc kubenswrapper[4901]: I0309 02:48:42.194540 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:42 crc kubenswrapper[4901]: I0309 02:48:42.870360 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r8t8d" Mar 09 02:48:43 crc kubenswrapper[4901]: I0309 02:48:43.884005 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:43 crc kubenswrapper[4901]: I0309 02:48:43.884073 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:43 crc kubenswrapper[4901]: I0309 02:48:43.953453 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:44 crc kubenswrapper[4901]: I0309 02:48:44.856543 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dsnfd" Mar 09 02:48:46 
crc kubenswrapper[4901]: I0309 02:48:46.614164 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-bcbkv" Mar 09 02:48:46 crc kubenswrapper[4901]: I0309 02:48:46.698425 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tpmfc"] Mar 09 02:48:49 crc kubenswrapper[4901]: I0309 02:48:49.766105 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:48:49 crc kubenswrapper[4901]: I0309 02:48:49.836499 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8l687" Mar 09 02:49:00 crc kubenswrapper[4901]: I0309 02:49:00.863006 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:49:00 crc kubenswrapper[4901]: I0309 02:49:00.863648 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:49:11 crc kubenswrapper[4901]: I0309 02:49:11.744097 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" podUID="3befa408-f48c-4244-81ca-6bf178967fbe" containerName="registry" containerID="cri-o://4accb1a5953199fd48c9457d5a8e4d43ff0a6f94b349e73f169595c37979ee89" gracePeriod=30 Mar 09 02:49:11 crc kubenswrapper[4901]: I0309 02:49:11.994015 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="3befa408-f48c-4244-81ca-6bf178967fbe" containerID="4accb1a5953199fd48c9457d5a8e4d43ff0a6f94b349e73f169595c37979ee89" exitCode=0 Mar 09 02:49:11 crc kubenswrapper[4901]: I0309 02:49:11.994082 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" event={"ID":"3befa408-f48c-4244-81ca-6bf178967fbe","Type":"ContainerDied","Data":"4accb1a5953199fd48c9457d5a8e4d43ff0a6f94b349e73f169595c37979ee89"} Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.152402 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.241531 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-bound-sa-token\") pod \"3befa408-f48c-4244-81ca-6bf178967fbe\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.241774 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"3befa408-f48c-4244-81ca-6bf178967fbe\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.241803 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3befa408-f48c-4244-81ca-6bf178967fbe-installation-pull-secrets\") pod \"3befa408-f48c-4244-81ca-6bf178967fbe\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.241828 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-registry-tls\") pod \"3befa408-f48c-4244-81ca-6bf178967fbe\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.241848 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-registry-certificates\") pod \"3befa408-f48c-4244-81ca-6bf178967fbe\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.241877 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-trusted-ca\") pod \"3befa408-f48c-4244-81ca-6bf178967fbe\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.241920 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3befa408-f48c-4244-81ca-6bf178967fbe-ca-trust-extracted\") pod \"3befa408-f48c-4244-81ca-6bf178967fbe\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.241945 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w2l5\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-kube-api-access-7w2l5\") pod \"3befa408-f48c-4244-81ca-6bf178967fbe\" (UID: \"3befa408-f48c-4244-81ca-6bf178967fbe\") " Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.242678 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3befa408-f48c-4244-81ca-6bf178967fbe" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.242730 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3befa408-f48c-4244-81ca-6bf178967fbe" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.248233 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3befa408-f48c-4244-81ca-6bf178967fbe" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.248804 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3befa408-f48c-4244-81ca-6bf178967fbe" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.250206 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-kube-api-access-7w2l5" (OuterVolumeSpecName: "kube-api-access-7w2l5") pod "3befa408-f48c-4244-81ca-6bf178967fbe" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe"). InnerVolumeSpecName "kube-api-access-7w2l5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.251404 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3befa408-f48c-4244-81ca-6bf178967fbe-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3befa408-f48c-4244-81ca-6bf178967fbe" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.258590 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "3befa408-f48c-4244-81ca-6bf178967fbe" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.266165 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3befa408-f48c-4244-81ca-6bf178967fbe-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3befa408-f48c-4244-81ca-6bf178967fbe" (UID: "3befa408-f48c-4244-81ca-6bf178967fbe"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.343627 4901 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.343689 4901 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3befa408-f48c-4244-81ca-6bf178967fbe-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.343714 4901 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.343731 4901 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.343752 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3befa408-f48c-4244-81ca-6bf178967fbe-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.343770 4901 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3befa408-f48c-4244-81ca-6bf178967fbe-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 02:49:12 crc kubenswrapper[4901]: I0309 02:49:12.343788 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w2l5\" (UniqueName: \"kubernetes.io/projected/3befa408-f48c-4244-81ca-6bf178967fbe-kube-api-access-7w2l5\") on node \"crc\" DevicePath \"\"" Mar 09 02:49:13 crc 
kubenswrapper[4901]: I0309 02:49:13.002263 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" event={"ID":"3befa408-f48c-4244-81ca-6bf178967fbe","Type":"ContainerDied","Data":"27911bfed1ce10bbb49a63a16d2402f2ee0007b275a29004aa1f7f43403610e7"} Mar 09 02:49:13 crc kubenswrapper[4901]: I0309 02:49:13.002305 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tpmfc" Mar 09 02:49:13 crc kubenswrapper[4901]: I0309 02:49:13.002356 4901 scope.go:117] "RemoveContainer" containerID="4accb1a5953199fd48c9457d5a8e4d43ff0a6f94b349e73f169595c37979ee89" Mar 09 02:49:13 crc kubenswrapper[4901]: I0309 02:49:13.042301 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tpmfc"] Mar 09 02:49:13 crc kubenswrapper[4901]: I0309 02:49:13.048756 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tpmfc"] Mar 09 02:49:14 crc kubenswrapper[4901]: I0309 02:49:14.119213 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3befa408-f48c-4244-81ca-6bf178967fbe" path="/var/lib/kubelet/pods/3befa408-f48c-4244-81ca-6bf178967fbe/volumes" Mar 09 02:49:30 crc kubenswrapper[4901]: I0309 02:49:30.863192 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:49:30 crc kubenswrapper[4901]: I0309 02:49:30.863900 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.138651 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550410-lbvpz"] Mar 09 02:50:00 crc kubenswrapper[4901]: E0309 02:50:00.139741 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3befa408-f48c-4244-81ca-6bf178967fbe" containerName="registry" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.139771 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3befa408-f48c-4244-81ca-6bf178967fbe" containerName="registry" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.140030 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3befa408-f48c-4244-81ca-6bf178967fbe" containerName="registry" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.140765 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550410-lbvpz" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.143876 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.144587 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.144620 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.146968 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-857bc\" (UniqueName: \"kubernetes.io/projected/9d474c00-28a1-469a-aafd-c9b5bc4dd558-kube-api-access-857bc\") pod \"auto-csr-approver-29550410-lbvpz\" (UID: \"9d474c00-28a1-469a-aafd-c9b5bc4dd558\") " pod="openshift-infra/auto-csr-approver-29550410-lbvpz" 
Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.148606 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550410-lbvpz"] Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.248457 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-857bc\" (UniqueName: \"kubernetes.io/projected/9d474c00-28a1-469a-aafd-c9b5bc4dd558-kube-api-access-857bc\") pod \"auto-csr-approver-29550410-lbvpz\" (UID: \"9d474c00-28a1-469a-aafd-c9b5bc4dd558\") " pod="openshift-infra/auto-csr-approver-29550410-lbvpz" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.280066 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-857bc\" (UniqueName: \"kubernetes.io/projected/9d474c00-28a1-469a-aafd-c9b5bc4dd558-kube-api-access-857bc\") pod \"auto-csr-approver-29550410-lbvpz\" (UID: \"9d474c00-28a1-469a-aafd-c9b5bc4dd558\") " pod="openshift-infra/auto-csr-approver-29550410-lbvpz" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.466631 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550410-lbvpz" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.863348 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.863434 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.863499 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.864178 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e216bbf999577ca8f01e583ee820521f2479f711576cf371b955a56a58308e3"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.864319 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://4e216bbf999577ca8f01e583ee820521f2479f711576cf371b955a56a58308e3" gracePeriod=600 Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.954600 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29550410-lbvpz"] Mar 09 02:50:00 crc kubenswrapper[4901]: I0309 02:50:00.966495 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 02:50:01 crc kubenswrapper[4901]: I0309 02:50:01.346784 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550410-lbvpz" event={"ID":"9d474c00-28a1-469a-aafd-c9b5bc4dd558","Type":"ContainerStarted","Data":"27894650f3a1d1e0d3a66fe8e4776cc4bf19720e34cdcf44ac43618e3e9b4f2c"} Mar 09 02:50:01 crc kubenswrapper[4901]: I0309 02:50:01.350040 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="4e216bbf999577ca8f01e583ee820521f2479f711576cf371b955a56a58308e3" exitCode=0 Mar 09 02:50:01 crc kubenswrapper[4901]: I0309 02:50:01.350108 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"4e216bbf999577ca8f01e583ee820521f2479f711576cf371b955a56a58308e3"} Mar 09 02:50:01 crc kubenswrapper[4901]: I0309 02:50:01.350165 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"cca9e0fab8d2b8ceab32875581fb830c17911c0a79e4cc7ee07e546219448782"} Mar 09 02:50:01 crc kubenswrapper[4901]: I0309 02:50:01.351207 4901 scope.go:117] "RemoveContainer" containerID="f7db76f28a31cfe2c8e33f8d3addf0b75358f97a36710315a069de1716506eae" Mar 09 02:50:02 crc kubenswrapper[4901]: I0309 02:50:02.391088 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550410-lbvpz" podStartSLOduration=1.31958416 podStartE2EDuration="2.391070432s" podCreationTimestamp="2026-03-09 02:50:00 +0000 UTC" firstStartedPulling="2026-03-09 
02:50:00.958130394 +0000 UTC m=+525.547794166" lastFinishedPulling="2026-03-09 02:50:02.029616666 +0000 UTC m=+526.619280438" observedRunningTime="2026-03-09 02:50:02.387376187 +0000 UTC m=+526.977039929" watchObservedRunningTime="2026-03-09 02:50:02.391070432 +0000 UTC m=+526.980734174" Mar 09 02:50:03 crc kubenswrapper[4901]: I0309 02:50:03.384190 4901 generic.go:334] "Generic (PLEG): container finished" podID="9d474c00-28a1-469a-aafd-c9b5bc4dd558" containerID="91454141e33be50fd8dc939678aa52fb96e5af281a4b959dcac3f73004e29217" exitCode=0 Mar 09 02:50:03 crc kubenswrapper[4901]: I0309 02:50:03.384327 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550410-lbvpz" event={"ID":"9d474c00-28a1-469a-aafd-c9b5bc4dd558","Type":"ContainerDied","Data":"91454141e33be50fd8dc939678aa52fb96e5af281a4b959dcac3f73004e29217"} Mar 09 02:50:04 crc kubenswrapper[4901]: I0309 02:50:04.793416 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550410-lbvpz" Mar 09 02:50:04 crc kubenswrapper[4901]: I0309 02:50:04.818886 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-857bc\" (UniqueName: \"kubernetes.io/projected/9d474c00-28a1-469a-aafd-c9b5bc4dd558-kube-api-access-857bc\") pod \"9d474c00-28a1-469a-aafd-c9b5bc4dd558\" (UID: \"9d474c00-28a1-469a-aafd-c9b5bc4dd558\") " Mar 09 02:50:04 crc kubenswrapper[4901]: I0309 02:50:04.832386 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d474c00-28a1-469a-aafd-c9b5bc4dd558-kube-api-access-857bc" (OuterVolumeSpecName: "kube-api-access-857bc") pod "9d474c00-28a1-469a-aafd-c9b5bc4dd558" (UID: "9d474c00-28a1-469a-aafd-c9b5bc4dd558"). InnerVolumeSpecName "kube-api-access-857bc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:50:04 crc kubenswrapper[4901]: I0309 02:50:04.920549 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-857bc\" (UniqueName: \"kubernetes.io/projected/9d474c00-28a1-469a-aafd-c9b5bc4dd558-kube-api-access-857bc\") on node \"crc\" DevicePath \"\"" Mar 09 02:50:05 crc kubenswrapper[4901]: I0309 02:50:05.399461 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550410-lbvpz" event={"ID":"9d474c00-28a1-469a-aafd-c9b5bc4dd558","Type":"ContainerDied","Data":"27894650f3a1d1e0d3a66fe8e4776cc4bf19720e34cdcf44ac43618e3e9b4f2c"} Mar 09 02:50:05 crc kubenswrapper[4901]: I0309 02:50:05.400096 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27894650f3a1d1e0d3a66fe8e4776cc4bf19720e34cdcf44ac43618e3e9b4f2c" Mar 09 02:50:05 crc kubenswrapper[4901]: I0309 02:50:05.399600 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550410-lbvpz" Mar 09 02:50:05 crc kubenswrapper[4901]: I0309 02:50:05.461365 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550404-w8858"] Mar 09 02:50:05 crc kubenswrapper[4901]: I0309 02:50:05.467444 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550404-w8858"] Mar 09 02:50:06 crc kubenswrapper[4901]: I0309 02:50:06.120569 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680e9e87-71a2-402c-84f2-e8eb2b7a4c44" path="/var/lib/kubelet/pods/680e9e87-71a2-402c-84f2-e8eb2b7a4c44/volumes" Mar 09 02:51:16 crc kubenswrapper[4901]: I0309 02:51:16.969771 4901 scope.go:117] "RemoveContainer" containerID="53230f46debba4713ebe4d6ad9b7aefb27b7fd4a4b57aebf05721b6999490970" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.154949 4901 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29550412-rf8dh"] Mar 09 02:52:00 crc kubenswrapper[4901]: E0309 02:52:00.155977 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d474c00-28a1-469a-aafd-c9b5bc4dd558" containerName="oc" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.156003 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d474c00-28a1-469a-aafd-c9b5bc4dd558" containerName="oc" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.156179 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d474c00-28a1-469a-aafd-c9b5bc4dd558" containerName="oc" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.156811 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550412-rf8dh" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.160019 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.160049 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.162974 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.163431 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550412-rf8dh"] Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.296867 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqgxv\" (UniqueName: \"kubernetes.io/projected/fcb536b6-7907-4645-abe6-bcb1489c6739-kube-api-access-wqgxv\") pod \"auto-csr-approver-29550412-rf8dh\" (UID: \"fcb536b6-7907-4645-abe6-bcb1489c6739\") " pod="openshift-infra/auto-csr-approver-29550412-rf8dh" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 
02:52:00.398777 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqgxv\" (UniqueName: \"kubernetes.io/projected/fcb536b6-7907-4645-abe6-bcb1489c6739-kube-api-access-wqgxv\") pod \"auto-csr-approver-29550412-rf8dh\" (UID: \"fcb536b6-7907-4645-abe6-bcb1489c6739\") " pod="openshift-infra/auto-csr-approver-29550412-rf8dh" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.431711 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqgxv\" (UniqueName: \"kubernetes.io/projected/fcb536b6-7907-4645-abe6-bcb1489c6739-kube-api-access-wqgxv\") pod \"auto-csr-approver-29550412-rf8dh\" (UID: \"fcb536b6-7907-4645-abe6-bcb1489c6739\") " pod="openshift-infra/auto-csr-approver-29550412-rf8dh" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.486712 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550412-rf8dh" Mar 09 02:52:00 crc kubenswrapper[4901]: I0309 02:52:00.791889 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550412-rf8dh"] Mar 09 02:52:01 crc kubenswrapper[4901]: I0309 02:52:01.512795 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550412-rf8dh" event={"ID":"fcb536b6-7907-4645-abe6-bcb1489c6739","Type":"ContainerStarted","Data":"8ee9143d809460bb2c420238261ab0269c2648853fd0191b8542f93b03a27947"} Mar 09 02:52:02 crc kubenswrapper[4901]: I0309 02:52:02.522822 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550412-rf8dh" event={"ID":"fcb536b6-7907-4645-abe6-bcb1489c6739","Type":"ContainerStarted","Data":"117f9dca07a548933ba5e15f807d01f05a62a9c6954c1054cff3e197c4a386e7"} Mar 09 02:52:02 crc kubenswrapper[4901]: I0309 02:52:02.546764 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550412-rf8dh" 
podStartSLOduration=1.335742966 podStartE2EDuration="2.546743786s" podCreationTimestamp="2026-03-09 02:52:00 +0000 UTC" firstStartedPulling="2026-03-09 02:52:00.8088533 +0000 UTC m=+645.398517062" lastFinishedPulling="2026-03-09 02:52:02.01985411 +0000 UTC m=+646.609517882" observedRunningTime="2026-03-09 02:52:02.543509386 +0000 UTC m=+647.133173158" watchObservedRunningTime="2026-03-09 02:52:02.546743786 +0000 UTC m=+647.136407518" Mar 09 02:52:03 crc kubenswrapper[4901]: I0309 02:52:03.532325 4901 generic.go:334] "Generic (PLEG): container finished" podID="fcb536b6-7907-4645-abe6-bcb1489c6739" containerID="117f9dca07a548933ba5e15f807d01f05a62a9c6954c1054cff3e197c4a386e7" exitCode=0 Mar 09 02:52:03 crc kubenswrapper[4901]: I0309 02:52:03.532398 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550412-rf8dh" event={"ID":"fcb536b6-7907-4645-abe6-bcb1489c6739","Type":"ContainerDied","Data":"117f9dca07a548933ba5e15f807d01f05a62a9c6954c1054cff3e197c4a386e7"} Mar 09 02:52:04 crc kubenswrapper[4901]: I0309 02:52:04.850676 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550412-rf8dh" Mar 09 02:52:04 crc kubenswrapper[4901]: I0309 02:52:04.960123 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqgxv\" (UniqueName: \"kubernetes.io/projected/fcb536b6-7907-4645-abe6-bcb1489c6739-kube-api-access-wqgxv\") pod \"fcb536b6-7907-4645-abe6-bcb1489c6739\" (UID: \"fcb536b6-7907-4645-abe6-bcb1489c6739\") " Mar 09 02:52:04 crc kubenswrapper[4901]: I0309 02:52:04.969703 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb536b6-7907-4645-abe6-bcb1489c6739-kube-api-access-wqgxv" (OuterVolumeSpecName: "kube-api-access-wqgxv") pod "fcb536b6-7907-4645-abe6-bcb1489c6739" (UID: "fcb536b6-7907-4645-abe6-bcb1489c6739"). InnerVolumeSpecName "kube-api-access-wqgxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:52:05 crc kubenswrapper[4901]: I0309 02:52:05.061367 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqgxv\" (UniqueName: \"kubernetes.io/projected/fcb536b6-7907-4645-abe6-bcb1489c6739-kube-api-access-wqgxv\") on node \"crc\" DevicePath \"\"" Mar 09 02:52:05 crc kubenswrapper[4901]: I0309 02:52:05.548982 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550412-rf8dh" event={"ID":"fcb536b6-7907-4645-abe6-bcb1489c6739","Type":"ContainerDied","Data":"8ee9143d809460bb2c420238261ab0269c2648853fd0191b8542f93b03a27947"} Mar 09 02:52:05 crc kubenswrapper[4901]: I0309 02:52:05.549059 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ee9143d809460bb2c420238261ab0269c2648853fd0191b8542f93b03a27947" Mar 09 02:52:05 crc kubenswrapper[4901]: I0309 02:52:05.549192 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550412-rf8dh" Mar 09 02:52:05 crc kubenswrapper[4901]: I0309 02:52:05.624275 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550406-smrn9"] Mar 09 02:52:05 crc kubenswrapper[4901]: I0309 02:52:05.628292 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550406-smrn9"] Mar 09 02:52:06 crc kubenswrapper[4901]: I0309 02:52:06.119997 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a751411-2ccc-4bae-bdc6-c34fe2385db3" path="/var/lib/kubelet/pods/9a751411-2ccc-4bae-bdc6-c34fe2385db3/volumes" Mar 09 02:52:17 crc kubenswrapper[4901]: I0309 02:52:17.016527 4901 scope.go:117] "RemoveContainer" containerID="d71e6d807ace698720a2a4dd399cb57a364cb762660df830f7ad7c026fc22a73" Mar 09 02:52:17 crc kubenswrapper[4901]: I0309 02:52:17.086963 4901 scope.go:117] "RemoveContainer" 
containerID="5172438ccca7ce5fed0dddfef9f317c9139e2cda2f00e28127f72de1def21d6f" Mar 09 02:52:17 crc kubenswrapper[4901]: I0309 02:52:17.141133 4901 scope.go:117] "RemoveContainer" containerID="39efb4f2ab60f7232608db324b7dfe0d78f50cfa4a13054ef9e97b8132ac8389" Mar 09 02:52:17 crc kubenswrapper[4901]: I0309 02:52:17.162265 4901 scope.go:117] "RemoveContainer" containerID="3209d20df68398b5c74788dabea1a04c821a3282bc7f7575499b8b0b0a83bfd3" Mar 09 02:52:30 crc kubenswrapper[4901]: I0309 02:52:30.862966 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:52:30 crc kubenswrapper[4901]: I0309 02:52:30.863600 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:53:00 crc kubenswrapper[4901]: I0309 02:53:00.863047 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:53:00 crc kubenswrapper[4901]: I0309 02:53:00.863830 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:53:30 crc kubenswrapper[4901]: I0309 02:53:30.863458 4901 
patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:53:30 crc kubenswrapper[4901]: I0309 02:53:30.864097 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:53:30 crc kubenswrapper[4901]: I0309 02:53:30.864149 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:53:30 crc kubenswrapper[4901]: I0309 02:53:30.864725 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cca9e0fab8d2b8ceab32875581fb830c17911c0a79e4cc7ee07e546219448782"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 02:53:30 crc kubenswrapper[4901]: I0309 02:53:30.864820 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://cca9e0fab8d2b8ceab32875581fb830c17911c0a79e4cc7ee07e546219448782" gracePeriod=600 Mar 09 02:53:31 crc kubenswrapper[4901]: I0309 02:53:31.505490 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="cca9e0fab8d2b8ceab32875581fb830c17911c0a79e4cc7ee07e546219448782" exitCode=0 Mar 09 02:53:31 crc 
kubenswrapper[4901]: I0309 02:53:31.505569 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"cca9e0fab8d2b8ceab32875581fb830c17911c0a79e4cc7ee07e546219448782"} Mar 09 02:53:31 crc kubenswrapper[4901]: I0309 02:53:31.506202 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"37ec3e94088a17553e2b069ce6fa01c84825c1f38b75b23f862711155501cfa6"} Mar 09 02:53:31 crc kubenswrapper[4901]: I0309 02:53:31.506323 4901 scope.go:117] "RemoveContainer" containerID="4e216bbf999577ca8f01e583ee820521f2479f711576cf371b955a56a58308e3" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.142272 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550414-zl74c"] Mar 09 02:54:00 crc kubenswrapper[4901]: E0309 02:54:00.142824 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb536b6-7907-4645-abe6-bcb1489c6739" containerName="oc" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.142834 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb536b6-7907-4645-abe6-bcb1489c6739" containerName="oc" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.142930 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb536b6-7907-4645-abe6-bcb1489c6739" containerName="oc" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.143263 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550414-zl74c" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.146311 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.146388 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.149246 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.156034 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550414-zl74c"] Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.242855 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8grqp\" (UniqueName: \"kubernetes.io/projected/439c6578-bec4-4371-9f36-bbe54de578bf-kube-api-access-8grqp\") pod \"auto-csr-approver-29550414-zl74c\" (UID: \"439c6578-bec4-4371-9f36-bbe54de578bf\") " pod="openshift-infra/auto-csr-approver-29550414-zl74c" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.344123 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8grqp\" (UniqueName: \"kubernetes.io/projected/439c6578-bec4-4371-9f36-bbe54de578bf-kube-api-access-8grqp\") pod \"auto-csr-approver-29550414-zl74c\" (UID: \"439c6578-bec4-4371-9f36-bbe54de578bf\") " pod="openshift-infra/auto-csr-approver-29550414-zl74c" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.366144 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8grqp\" (UniqueName: \"kubernetes.io/projected/439c6578-bec4-4371-9f36-bbe54de578bf-kube-api-access-8grqp\") pod \"auto-csr-approver-29550414-zl74c\" (UID: \"439c6578-bec4-4371-9f36-bbe54de578bf\") " 
pod="openshift-infra/auto-csr-approver-29550414-zl74c" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.478630 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550414-zl74c" Mar 09 02:54:00 crc kubenswrapper[4901]: I0309 02:54:00.737882 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550414-zl74c"] Mar 09 02:54:00 crc kubenswrapper[4901]: W0309 02:54:00.744682 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod439c6578_bec4_4371_9f36_bbe54de578bf.slice/crio-7ab74d859e32dc0efa9f0f7707bbc1f186f3429c0d7064f0709ec9604fcb180e WatchSource:0}: Error finding container 7ab74d859e32dc0efa9f0f7707bbc1f186f3429c0d7064f0709ec9604fcb180e: Status 404 returned error can't find the container with id 7ab74d859e32dc0efa9f0f7707bbc1f186f3429c0d7064f0709ec9604fcb180e Mar 09 02:54:01 crc kubenswrapper[4901]: I0309 02:54:01.726665 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550414-zl74c" event={"ID":"439c6578-bec4-4371-9f36-bbe54de578bf","Type":"ContainerStarted","Data":"7ab74d859e32dc0efa9f0f7707bbc1f186f3429c0d7064f0709ec9604fcb180e"} Mar 09 02:54:02 crc kubenswrapper[4901]: I0309 02:54:02.739771 4901 generic.go:334] "Generic (PLEG): container finished" podID="439c6578-bec4-4371-9f36-bbe54de578bf" containerID="da4a75ba9b15303e5fc81245b4255efd1bebcc96b23483870a8d5b6ef37602ff" exitCode=0 Mar 09 02:54:02 crc kubenswrapper[4901]: I0309 02:54:02.739864 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550414-zl74c" event={"ID":"439c6578-bec4-4371-9f36-bbe54de578bf","Type":"ContainerDied","Data":"da4a75ba9b15303e5fc81245b4255efd1bebcc96b23483870a8d5b6ef37602ff"} Mar 09 02:54:04 crc kubenswrapper[4901]: I0309 02:54:04.047183 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550414-zl74c" Mar 09 02:54:04 crc kubenswrapper[4901]: I0309 02:54:04.099316 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8grqp\" (UniqueName: \"kubernetes.io/projected/439c6578-bec4-4371-9f36-bbe54de578bf-kube-api-access-8grqp\") pod \"439c6578-bec4-4371-9f36-bbe54de578bf\" (UID: \"439c6578-bec4-4371-9f36-bbe54de578bf\") " Mar 09 02:54:04 crc kubenswrapper[4901]: I0309 02:54:04.104731 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439c6578-bec4-4371-9f36-bbe54de578bf-kube-api-access-8grqp" (OuterVolumeSpecName: "kube-api-access-8grqp") pod "439c6578-bec4-4371-9f36-bbe54de578bf" (UID: "439c6578-bec4-4371-9f36-bbe54de578bf"). InnerVolumeSpecName "kube-api-access-8grqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:54:04 crc kubenswrapper[4901]: I0309 02:54:04.201125 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8grqp\" (UniqueName: \"kubernetes.io/projected/439c6578-bec4-4371-9f36-bbe54de578bf-kube-api-access-8grqp\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:04 crc kubenswrapper[4901]: I0309 02:54:04.751943 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550414-zl74c" event={"ID":"439c6578-bec4-4371-9f36-bbe54de578bf","Type":"ContainerDied","Data":"7ab74d859e32dc0efa9f0f7707bbc1f186f3429c0d7064f0709ec9604fcb180e"} Mar 09 02:54:04 crc kubenswrapper[4901]: I0309 02:54:04.752000 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ab74d859e32dc0efa9f0f7707bbc1f186f3429c0d7064f0709ec9604fcb180e" Mar 09 02:54:04 crc kubenswrapper[4901]: I0309 02:54:04.752005 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550414-zl74c" Mar 09 02:54:05 crc kubenswrapper[4901]: I0309 02:54:05.122933 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550408-m756h"] Mar 09 02:54:05 crc kubenswrapper[4901]: I0309 02:54:05.126744 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550408-m756h"] Mar 09 02:54:06 crc kubenswrapper[4901]: I0309 02:54:06.117567 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f216f4-98d4-44fe-b4f3-e6908f28ed4e" path="/var/lib/kubelet/pods/75f216f4-98d4-44fe-b4f3-e6908f28ed4e/volumes" Mar 09 02:54:17 crc kubenswrapper[4901]: I0309 02:54:17.250134 4901 scope.go:117] "RemoveContainer" containerID="9aa22b48b9223f46a4856b17630905d18e6700f71dd31c936b28850433b71299" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.566087 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bmfgc"] Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.567360 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovn-controller" containerID="cri-o://4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87" gracePeriod=30 Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.567815 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="sbdb" containerID="cri-o://f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272" gracePeriod=30 Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.567866 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" 
containerName="nbdb" containerID="cri-o://8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0" gracePeriod=30 Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.567912 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="northd" containerID="cri-o://9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77" gracePeriod=30 Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.567948 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae" gracePeriod=30 Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.567985 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="kube-rbac-proxy-node" containerID="cri-o://3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4" gracePeriod=30 Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.568019 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovn-acl-logging" containerID="cri-o://17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49" gracePeriod=30 Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.614464 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" containerID="cri-o://3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289" gracePeriod=30 Mar 09 02:54:50 crc 
kubenswrapper[4901]: I0309 02:54:50.847495 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/2.log" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.849926 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovn-acl-logging/0.log" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.850392 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovn-controller/0.log" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.850881 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899297 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xn8v2"] Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899610 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899632 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899652 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899665 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899685 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="sbdb" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899699 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="sbdb" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899719 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="kube-rbac-proxy-node" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899732 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="kube-rbac-proxy-node" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899757 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovn-acl-logging" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899769 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovn-acl-logging" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899788 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovn-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899801 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovn-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899824 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="northd" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899837 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="northd" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899853 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" 
containerName="kube-rbac-proxy-ovn-metrics" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899866 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899885 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439c6578-bec4-4371-9f36-bbe54de578bf" containerName="oc" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899899 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="439c6578-bec4-4371-9f36-bbe54de578bf" containerName="oc" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899916 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="kubecfg-setup" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899928 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="kubecfg-setup" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.899961 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="nbdb" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.899974 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="nbdb" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900141 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="kube-rbac-proxy-node" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900163 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="nbdb" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900181 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" 
containerName="kube-rbac-proxy-ovn-metrics" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900202 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="439c6578-bec4-4371-9f36-bbe54de578bf" containerName="oc" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900243 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovn-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900260 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900276 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900294 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900311 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="northd" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900326 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovn-acl-logging" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900346 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="sbdb" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.900532 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900547 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: E0309 02:54:50.900567 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900581 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.900751 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerName="ovnkube-controller" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.904311 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972705 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-log-socket\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972754 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-etc-openvswitch\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972793 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-kubelet\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972824 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-var-lib-openvswitch\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972826 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-log-socket" (OuterVolumeSpecName: "log-socket") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972845 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-node-log\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972872 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-openvswitch\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972895 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-netns\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972919 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-slash\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972875 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972892 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972910 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-node-log" (OuterVolumeSpecName: "node-log") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972923 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972954 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-ovn-kubernetes\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972962 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972974 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-netd\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972981 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-slash" (OuterVolumeSpecName: "host-slash") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.972999 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973003 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-bin\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973017 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973034 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973040 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cddsm\" (UniqueName: \"kubernetes.io/projected/40c17e04-3fc2-48a2-95dc-fe0428b91e66-kube-api-access-cddsm\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973051 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973067 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-systemd\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973099 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-ovn\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973126 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-var-lib-cni-networks-ovn-kubernetes\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: 
I0309 02:54:50.973148 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-config\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973177 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-script-lib\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973197 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-systemd-units\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973234 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovn-node-metrics-cert\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973292 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-env-overrides\") pod \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\" (UID: \"40c17e04-3fc2-48a2-95dc-fe0428b91e66\") " Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973444 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-ovnkube-config\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973472 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-run-ovn\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973496 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-env-overrides\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973520 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45v6w\" (UniqueName: \"kubernetes.io/projected/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-kube-api-access-45v6w\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973542 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-run-openvswitch\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973564 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-log-socket\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973597 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-slash\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973623 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-kubelet\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973651 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-etc-openvswitch\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973692 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-systemd-units\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973717 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-run-netns\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973741 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973766 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973789 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-cni-bin\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973813 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-node-log\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973847 4901 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973873 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973898 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-run-systemd\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.973968 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-var-lib-openvswitch\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974006 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: 
"40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974216 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-cni-netd\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974326 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-ovn-node-metrics-cert\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974360 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-ovnkube-script-lib\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974467 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974481 4901 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974505 4901 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974538 4901 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974562 4901 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-log-socket\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974575 4901 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974588 4901 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974600 4901 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc 
kubenswrapper[4901]: I0309 02:54:50.974776 4901 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-node-log\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974794 4901 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974808 4901 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-slash\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974820 4901 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974834 4901 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974816 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974850 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974847 4901 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.974932 4901 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.980705 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 02:54:50 crc kubenswrapper[4901]: I0309 02:54:50.980794 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c17e04-3fc2-48a2-95dc-fe0428b91e66-kube-api-access-cddsm" (OuterVolumeSpecName: "kube-api-access-cddsm") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "kube-api-access-cddsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.000896 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "40c17e04-3fc2-48a2-95dc-fe0428b91e66" (UID: "40c17e04-3fc2-48a2-95dc-fe0428b91e66"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.061231 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovnkube-controller/2.log" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064012 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovn-acl-logging/0.log" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064480 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmfgc_40c17e04-3fc2-48a2-95dc-fe0428b91e66/ovn-controller/0.log" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064830 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289" exitCode=0 Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064852 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272" exitCode=0 Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064860 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0" exitCode=0 Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 
02:54:51.064866 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77" exitCode=0 Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064874 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae" exitCode=0 Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064882 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4" exitCode=0 Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064891 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49" exitCode=143 Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064897 4901 generic.go:334] "Generic (PLEG): container finished" podID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" containerID="4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87" exitCode=143 Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064926 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064949 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064959 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064969 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064977 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064986 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.064996 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065004 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065009 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065014 4901 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065019 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065024 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065029 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065034 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065039 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065047 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065055 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"} Mar 09 02:54:51 crc 
kubenswrapper[4901]: I0309 02:54:51.065062 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065067 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065071 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065076 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065081 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065086 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065091 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065096 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"} Mar 09 02:54:51 crc 
kubenswrapper[4901]: I0309 02:54:51.065100 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065107 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065114 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065119 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065125 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065130 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065136 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065141 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065146 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065151 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065156 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065161 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065168 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" event={"ID":"40c17e04-3fc2-48a2-95dc-fe0428b91e66","Type":"ContainerDied","Data":"9518415db7d7377fe8b5c75deee1d0719fdd022fe2f41787df42c8537002e69c"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065175 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065181 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065186 4901 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065191 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065196 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065201 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065207 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065212 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065216 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065243 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065256 4901 
scope.go:117] "RemoveContainer" containerID="3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.065373 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bmfgc" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.072571 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-429fk_a0d0e040-7ca3-4af8-9f02-d96cff6b3edf/kube-multus/1.log" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.073268 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-429fk_a0d0e040-7ca3-4af8-9f02-d96cff6b3edf/kube-multus/0.log" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.073312 4901 generic.go:334] "Generic (PLEG): container finished" podID="a0d0e040-7ca3-4af8-9f02-d96cff6b3edf" containerID="3c66d39cf5683043c4f736459fb89385f1480a81932fe2d1f1fd5dc7314f6f54" exitCode=2 Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.073345 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-429fk" event={"ID":"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf","Type":"ContainerDied","Data":"3c66d39cf5683043c4f736459fb89385f1480a81932fe2d1f1fd5dc7314f6f54"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.073369 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1"} Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.073798 4901 scope.go:117] "RemoveContainer" containerID="3c66d39cf5683043c4f736459fb89385f1480a81932fe2d1f1fd5dc7314f6f54" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075378 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-cni-netd\") pod 
\"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075410 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-ovn-node-metrics-cert\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075427 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-ovnkube-script-lib\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075452 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-ovnkube-config\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075469 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-run-ovn\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075483 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-env-overrides\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075500 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-run-openvswitch\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075515 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45v6w\" (UniqueName: \"kubernetes.io/projected/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-kube-api-access-45v6w\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075529 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-log-socket\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075549 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-slash\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075564 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-kubelet\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc 
kubenswrapper[4901]: I0309 02:54:51.075588 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-etc-openvswitch\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075610 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-systemd-units\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075627 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-run-netns\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075642 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075656 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc 
kubenswrapper[4901]: I0309 02:54:51.075671 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-cni-bin\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075687 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-node-log\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075703 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-run-systemd\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075718 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-var-lib-openvswitch\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075752 4901 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075762 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075772 4901 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075783 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cddsm\" (UniqueName: \"kubernetes.io/projected/40c17e04-3fc2-48a2-95dc-fe0428b91e66-kube-api-access-cddsm\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075793 4901 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40c17e04-3fc2-48a2-95dc-fe0428b91e66-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075802 4901 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40c17e04-3fc2-48a2-95dc-fe0428b91e66-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075838 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-var-lib-openvswitch\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.075868 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-run-ovn\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 
02:54:51.076151 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-cni-netd\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.076329 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-systemd-units\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.076381 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-slash\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.076433 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-etc-openvswitch\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.076425 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-log-socket\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.076457 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-kubelet\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.076476 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-run-openvswitch\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.076586 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.076635 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-run-netns\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.076777 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-ovnkube-config\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.077479 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-env-overrides\") pod 
\"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.077509 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-node-log\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.077532 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-cni-bin\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.077525 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-run-systemd\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.077898 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-ovnkube-script-lib\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.078595 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.090141 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-ovn-node-metrics-cert\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.112350 4901 scope.go:117] "RemoveContainer" containerID="42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.115728 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45v6w\" (UniqueName: \"kubernetes.io/projected/9a1851a0-5022-46f8-b5d7-94bb4e23bfc5-kube-api-access-45v6w\") pod \"ovnkube-node-xn8v2\" (UID: \"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.140745 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bmfgc"] Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.152921 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bmfgc"] Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.161107 4901 scope.go:117] "RemoveContainer" containerID="f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.178413 4901 scope.go:117] "RemoveContainer" containerID="8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.195978 4901 scope.go:117] "RemoveContainer" containerID="9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.217880 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.218969 4901 scope.go:117] "RemoveContainer" containerID="fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.237613 4901 scope.go:117] "RemoveContainer" containerID="3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.262402 4901 scope.go:117] "RemoveContainer" containerID="17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.278116 4901 scope.go:117] "RemoveContainer" containerID="4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.311547 4901 scope.go:117] "RemoveContainer" containerID="f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.334577 4901 scope.go:117] "RemoveContainer" containerID="3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289" Mar 09 02:54:51 crc kubenswrapper[4901]: E0309 02:54:51.335066 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": container with ID starting with 3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289 not found: ID does not exist" containerID="3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.335103 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"} err="failed to get container status \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": rpc error: code = NotFound desc = could not find 
container \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": container with ID starting with 3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289 not found: ID does not exist" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.335149 4901 scope.go:117] "RemoveContainer" containerID="42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e" Mar 09 02:54:51 crc kubenswrapper[4901]: E0309 02:54:51.335456 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\": container with ID starting with 42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e not found: ID does not exist" containerID="42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.335482 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"} err="failed to get container status \"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\": rpc error: code = NotFound desc = could not find container \"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\": container with ID starting with 42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e not found: ID does not exist" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.335498 4901 scope.go:117] "RemoveContainer" containerID="f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272" Mar 09 02:54:51 crc kubenswrapper[4901]: E0309 02:54:51.335741 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\": container with ID starting with f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272 not found: ID does 
not exist" containerID="f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.335766 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"} err="failed to get container status \"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\": rpc error: code = NotFound desc = could not find container \"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\": container with ID starting with f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272 not found: ID does not exist" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.335783 4901 scope.go:117] "RemoveContainer" containerID="8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0" Mar 09 02:54:51 crc kubenswrapper[4901]: E0309 02:54:51.336103 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\": container with ID starting with 8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0 not found: ID does not exist" containerID="8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.336137 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"} err="failed to get container status \"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\": rpc error: code = NotFound desc = could not find container \"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\": container with ID starting with 8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0 not found: ID does not exist" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.336154 4901 
scope.go:117] "RemoveContainer" containerID="9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77" Mar 09 02:54:51 crc kubenswrapper[4901]: E0309 02:54:51.336450 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\": container with ID starting with 9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77 not found: ID does not exist" containerID="9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.336473 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"} err="failed to get container status \"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\": rpc error: code = NotFound desc = could not find container \"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\": container with ID starting with 9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77 not found: ID does not exist" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.336485 4901 scope.go:117] "RemoveContainer" containerID="fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae" Mar 09 02:54:51 crc kubenswrapper[4901]: E0309 02:54:51.336711 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\": container with ID starting with fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae not found: ID does not exist" containerID="fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.336758 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"} err="failed to get container status \"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\": rpc error: code = NotFound desc = could not find container \"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\": container with ID starting with fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae not found: ID does not exist" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.336776 4901 scope.go:117] "RemoveContainer" containerID="3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4" Mar 09 02:54:51 crc kubenswrapper[4901]: E0309 02:54:51.337037 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\": container with ID starting with 3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4 not found: ID does not exist" containerID="3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.337065 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"} err="failed to get container status \"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\": rpc error: code = NotFound desc = could not find container \"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\": container with ID starting with 3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4 not found: ID does not exist" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.337088 4901 scope.go:117] "RemoveContainer" containerID="17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49" Mar 09 02:54:51 crc kubenswrapper[4901]: E0309 02:54:51.337350 4901 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\": container with ID starting with 17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49 not found: ID does not exist" containerID="17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.337373 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"} err="failed to get container status \"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\": rpc error: code = NotFound desc = could not find container \"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\": container with ID starting with 17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49 not found: ID does not exist" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.337409 4901 scope.go:117] "RemoveContainer" containerID="4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87" Mar 09 02:54:51 crc kubenswrapper[4901]: E0309 02:54:51.337655 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\": container with ID starting with 4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87 not found: ID does not exist" containerID="4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87" Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.337680 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"} err="failed to get container status \"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\": rpc error: code = NotFound desc = could not find container 
\"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\": container with ID starting with 4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.337698 4901 scope.go:117] "RemoveContainer" containerID="f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"
Mar 09 02:54:51 crc kubenswrapper[4901]: E0309 02:54:51.338314 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\": container with ID starting with f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf not found: ID does not exist" containerID="f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.338338 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"} err="failed to get container status \"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\": rpc error: code = NotFound desc = could not find container \"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\": container with ID starting with f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.338355 4901 scope.go:117] "RemoveContainer" containerID="3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.338727 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"} err="failed to get container status \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": rpc error: code = NotFound desc = could not find container \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": container with ID starting with 3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.338782 4901 scope.go:117] "RemoveContainer" containerID="42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.339121 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"} err="failed to get container status \"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\": rpc error: code = NotFound desc = could not find container \"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\": container with ID starting with 42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.339142 4901 scope.go:117] "RemoveContainer" containerID="f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.339609 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"} err="failed to get container status \"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\": rpc error: code = NotFound desc = could not find container \"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\": container with ID starting with f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.339634 4901 scope.go:117] "RemoveContainer" containerID="8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.339908 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"} err="failed to get container status \"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\": rpc error: code = NotFound desc = could not find container \"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\": container with ID starting with 8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.339949 4901 scope.go:117] "RemoveContainer" containerID="9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.340198 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"} err="failed to get container status \"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\": rpc error: code = NotFound desc = could not find container \"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\": container with ID starting with 9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.340233 4901 scope.go:117] "RemoveContainer" containerID="fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.340503 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"} err="failed to get container status \"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\": rpc error: code = NotFound desc = could not find container \"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\": container with ID starting with fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.340526 4901 scope.go:117] "RemoveContainer" containerID="3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.340834 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"} err="failed to get container status \"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\": rpc error: code = NotFound desc = could not find container \"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\": container with ID starting with 3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.340851 4901 scope.go:117] "RemoveContainer" containerID="17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.341036 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"} err="failed to get container status \"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\": rpc error: code = NotFound desc = could not find container \"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\": container with ID starting with 17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.341052 4901 scope.go:117] "RemoveContainer" containerID="4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.341552 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"} err="failed to get container status \"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\": rpc error: code = NotFound desc = could not find container \"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\": container with ID starting with 4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.341581 4901 scope.go:117] "RemoveContainer" containerID="f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.341928 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"} err="failed to get container status \"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\": rpc error: code = NotFound desc = could not find container \"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\": container with ID starting with f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.341944 4901 scope.go:117] "RemoveContainer" containerID="3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.342192 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"} err="failed to get container status \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": rpc error: code = NotFound desc = could not find container \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": container with ID starting with 3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.342210 4901 scope.go:117] "RemoveContainer" containerID="42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.342542 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"} err="failed to get container status \"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\": rpc error: code = NotFound desc = could not find container \"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\": container with ID starting with 42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.342594 4901 scope.go:117] "RemoveContainer" containerID="f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.342896 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"} err="failed to get container status \"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\": rpc error: code = NotFound desc = could not find container \"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\": container with ID starting with f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.342917 4901 scope.go:117] "RemoveContainer" containerID="8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.343186 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"} err="failed to get container status \"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\": rpc error: code = NotFound desc = could not find container \"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\": container with ID starting with 8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.343230 4901 scope.go:117] "RemoveContainer" containerID="9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.343536 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"} err="failed to get container status \"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\": rpc error: code = NotFound desc = could not find container \"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\": container with ID starting with 9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.343553 4901 scope.go:117] "RemoveContainer" containerID="fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.343820 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"} err="failed to get container status \"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\": rpc error: code = NotFound desc = could not find container \"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\": container with ID starting with fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.343836 4901 scope.go:117] "RemoveContainer" containerID="3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.344184 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"} err="failed to get container status \"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\": rpc error: code = NotFound desc = could not find container \"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\": container with ID starting with 3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.344211 4901 scope.go:117] "RemoveContainer" containerID="17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.344524 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"} err="failed to get container status \"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\": rpc error: code = NotFound desc = could not find container \"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\": container with ID starting with 17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.344551 4901 scope.go:117] "RemoveContainer" containerID="4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.344901 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"} err="failed to get container status \"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\": rpc error: code = NotFound desc = could not find container \"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\": container with ID starting with 4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.344938 4901 scope.go:117] "RemoveContainer" containerID="f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.345234 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"} err="failed to get container status \"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\": rpc error: code = NotFound desc = could not find container \"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\": container with ID starting with f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.345255 4901 scope.go:117] "RemoveContainer" containerID="3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.345484 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"} err="failed to get container status \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": rpc error: code = NotFound desc = could not find container \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": container with ID starting with 3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.345500 4901 scope.go:117] "RemoveContainer" containerID="42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.345725 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e"} err="failed to get container status \"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\": rpc error: code = NotFound desc = could not find container \"42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e\": container with ID starting with 42126b9cd7c418140407793d61fb0f841ddd7a74cc146c4a922d49bb56828d3e not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.345747 4901 scope.go:117] "RemoveContainer" containerID="f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.346013 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272"} err="failed to get container status \"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\": rpc error: code = NotFound desc = could not find container \"f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272\": container with ID starting with f2c4c63b1ab59c59daca5ed71ee5ed0e53a565ffbc793ac516f9ba75a4ece272 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.346036 4901 scope.go:117] "RemoveContainer" containerID="8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.346345 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0"} err="failed to get container status \"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\": rpc error: code = NotFound desc = could not find container \"8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0\": container with ID starting with 8eead4ce26a12a03f0e1808e8e8479ca48c9a409935ea566fae8a7cc2d8bc7c0 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.346366 4901 scope.go:117] "RemoveContainer" containerID="9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.346673 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77"} err="failed to get container status \"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\": rpc error: code = NotFound desc = could not find container \"9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77\": container with ID starting with 9c9f87a22de3336b00d43f7848f50bda9f5f27f7ac84c6c6e63a224583005e77 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.346700 4901 scope.go:117] "RemoveContainer" containerID="fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.346953 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae"} err="failed to get container status \"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\": rpc error: code = NotFound desc = could not find container \"fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae\": container with ID starting with fce11cf214e5d6313c477f4939e0627d14eecc389d7df43bfa0a4bff5152baae not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.346978 4901 scope.go:117] "RemoveContainer" containerID="3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.347301 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4"} err="failed to get container status \"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\": rpc error: code = NotFound desc = could not find container \"3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4\": container with ID starting with 3178075164db642d21adaf4deed75f6f295794530311c6476b6f0376101b61c4 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.347325 4901 scope.go:117] "RemoveContainer" containerID="17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.347566 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49"} err="failed to get container status \"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\": rpc error: code = NotFound desc = could not find container \"17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49\": container with ID starting with 17a115bbd5ac61babf8f88452e2c50b49fd24b34589357e4a7d59bd2a3511b49 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.347585 4901 scope.go:117] "RemoveContainer" containerID="4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.347896 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87"} err="failed to get container status \"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\": rpc error: code = NotFound desc = could not find container \"4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87\": container with ID starting with 4ce9289308fa0e0313c1e1de4a2da2e1544bf309f32c6f58cf4a914210f10c87 not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.347913 4901 scope.go:117] "RemoveContainer" containerID="f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.348145 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf"} err="failed to get container status \"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\": rpc error: code = NotFound desc = could not find container \"f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf\": container with ID starting with f3947a1608ce0285ce265114f0e360f669eb7f593722a7ec363ed4c3f29505cf not found: ID does not exist"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.348170 4901 scope.go:117] "RemoveContainer" containerID="3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"
Mar 09 02:54:51 crc kubenswrapper[4901]: I0309 02:54:51.348478 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289"} err="failed to get container status \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": rpc error: code = NotFound desc = could not find container \"3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289\": container with ID starting with 3e4b1f890041b1c44ceb9cb0ef734bdaa379b3af137404deaacb49deb082a289 not found: ID does not exist"
Mar 09 02:54:52 crc kubenswrapper[4901]: I0309 02:54:52.082250 4901 generic.go:334] "Generic (PLEG): container finished" podID="9a1851a0-5022-46f8-b5d7-94bb4e23bfc5" containerID="d5fe94a0373f92cc47e05ee59f2c5f5b639a1d940133457c43fe5c6482529a9f" exitCode=0
Mar 09 02:54:52 crc kubenswrapper[4901]: I0309 02:54:52.082369 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" event={"ID":"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5","Type":"ContainerDied","Data":"d5fe94a0373f92cc47e05ee59f2c5f5b639a1d940133457c43fe5c6482529a9f"}
Mar 09 02:54:52 crc kubenswrapper[4901]: I0309 02:54:52.082765 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" event={"ID":"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5","Type":"ContainerStarted","Data":"c6293013abdcdd2691b2436dcd1d02a404482fc9a901417a219f814327a5433a"}
Mar 09 02:54:52 crc kubenswrapper[4901]: I0309 02:54:52.087269 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-429fk_a0d0e040-7ca3-4af8-9f02-d96cff6b3edf/kube-multus/1.log"
Mar 09 02:54:52 crc kubenswrapper[4901]: I0309 02:54:52.087871 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-429fk_a0d0e040-7ca3-4af8-9f02-d96cff6b3edf/kube-multus/0.log"
Mar 09 02:54:52 crc kubenswrapper[4901]: I0309 02:54:52.087930 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-429fk" event={"ID":"a0d0e040-7ca3-4af8-9f02-d96cff6b3edf","Type":"ContainerStarted","Data":"1b603bb65b83a36c206fba55c1e41d53df79404555f5bd2e1a4a81fb4ef4c6a8"}
Mar 09 02:54:52 crc kubenswrapper[4901]: I0309 02:54:52.117018 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c17e04-3fc2-48a2-95dc-fe0428b91e66" path="/var/lib/kubelet/pods/40c17e04-3fc2-48a2-95dc-fe0428b91e66/volumes"
Mar 09 02:54:52 crc kubenswrapper[4901]: I0309 02:54:52.913540 4901 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 02:54:53 crc kubenswrapper[4901]: I0309 02:54:53.095007 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" event={"ID":"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5","Type":"ContainerStarted","Data":"f244927eb24734eac651ecc2dca984e694004af9b7e0bc4687da8310ce47a32f"}
Mar 09 02:54:53 crc kubenswrapper[4901]: I0309 02:54:53.095053 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" event={"ID":"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5","Type":"ContainerStarted","Data":"0a3c1267f2384d667460e3c42551853678e8e07d722e54fcc7982a90e292cefe"}
Mar 09 02:54:53 crc kubenswrapper[4901]: I0309 02:54:53.095071 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" event={"ID":"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5","Type":"ContainerStarted","Data":"40a9bf69e2aa4274de961c409599a67eb100b5c2dc3759750945b625e2d8e8e7"}
Mar 09 02:54:53 crc kubenswrapper[4901]: I0309 02:54:53.095082 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" event={"ID":"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5","Type":"ContainerStarted","Data":"807d3443d6375dd51b4950bfb68e985a68b7b8645287b2eb261063aff8d1759d"}
Mar 09 02:54:53 crc kubenswrapper[4901]: I0309 02:54:53.095092 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" event={"ID":"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5","Type":"ContainerStarted","Data":"f6de2eff7c363f7635b78b8cac58f1ed8c3475cd333d4c98b434111160004657"}
Mar 09 02:54:53 crc kubenswrapper[4901]: I0309 02:54:53.095102 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" event={"ID":"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5","Type":"ContainerStarted","Data":"725db9394dd0adab29dc25178c2d6c164c4180af2134552541ef6dfb7d27c078"}
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.140395 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" event={"ID":"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5","Type":"ContainerStarted","Data":"4e8573026803ead3f921048d2778a965a55a20feef1e8a24baf8f19dd4cdb492"}
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.488538 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-j4cvp"]
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.489293 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.490980 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.491171 4901 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xv8zs"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.491328 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.491496 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.545379 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbhpz\" (UniqueName: \"kubernetes.io/projected/e0ede59c-7e38-437e-ba16-82adff7f9ef4-kube-api-access-lbhpz\") pod \"crc-storage-crc-j4cvp\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.545456 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e0ede59c-7e38-437e-ba16-82adff7f9ef4-crc-storage\") pod \"crc-storage-crc-j4cvp\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.545522 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e0ede59c-7e38-437e-ba16-82adff7f9ef4-node-mnt\") pod \"crc-storage-crc-j4cvp\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.646884 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e0ede59c-7e38-437e-ba16-82adff7f9ef4-node-mnt\") pod \"crc-storage-crc-j4cvp\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.647036 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbhpz\" (UniqueName: \"kubernetes.io/projected/e0ede59c-7e38-437e-ba16-82adff7f9ef4-kube-api-access-lbhpz\") pod \"crc-storage-crc-j4cvp\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.647090 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e0ede59c-7e38-437e-ba16-82adff7f9ef4-crc-storage\") pod \"crc-storage-crc-j4cvp\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.647283 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e0ede59c-7e38-437e-ba16-82adff7f9ef4-node-mnt\") pod \"crc-storage-crc-j4cvp\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.651515 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e0ede59c-7e38-437e-ba16-82adff7f9ef4-crc-storage\") pod \"crc-storage-crc-j4cvp\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.688424 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbhpz\" (UniqueName: \"kubernetes.io/projected/e0ede59c-7e38-437e-ba16-82adff7f9ef4-kube-api-access-lbhpz\") pod \"crc-storage-crc-j4cvp\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: I0309 02:54:56.815855 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: E0309 02:54:56.848402 4901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j4cvp_crc-storage_e0ede59c-7e38-437e-ba16-82adff7f9ef4_0(f94dc191a1de603c8b2eab497439fb87ef923b6690141d76692d0d01f1f9813a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 02:54:56 crc kubenswrapper[4901]: E0309 02:54:56.848587 4901 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j4cvp_crc-storage_e0ede59c-7e38-437e-ba16-82adff7f9ef4_0(f94dc191a1de603c8b2eab497439fb87ef923b6690141d76692d0d01f1f9813a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: E0309 02:54:56.848693 4901 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j4cvp_crc-storage_e0ede59c-7e38-437e-ba16-82adff7f9ef4_0(f94dc191a1de603c8b2eab497439fb87ef923b6690141d76692d0d01f1f9813a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:56 crc kubenswrapper[4901]: E0309 02:54:56.848834 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-j4cvp_crc-storage(e0ede59c-7e38-437e-ba16-82adff7f9ef4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-j4cvp_crc-storage(e0ede59c-7e38-437e-ba16-82adff7f9ef4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j4cvp_crc-storage_e0ede59c-7e38-437e-ba16-82adff7f9ef4_0(f94dc191a1de603c8b2eab497439fb87ef923b6690141d76692d0d01f1f9813a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-j4cvp" podUID="e0ede59c-7e38-437e-ba16-82adff7f9ef4"
Mar 09 02:54:58 crc kubenswrapper[4901]: I0309 02:54:58.157623 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" event={"ID":"9a1851a0-5022-46f8-b5d7-94bb4e23bfc5","Type":"ContainerStarted","Data":"2d72df4e71812b96651ead9852b3684b4ad51a57913647683e0d96ce064c8980"}
Mar 09 02:54:58 crc kubenswrapper[4901]: I0309 02:54:58.158306 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2"
Mar 09 02:54:58 crc kubenswrapper[4901]: I0309 02:54:58.202932 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" podStartSLOduration=8.202890842 podStartE2EDuration="8.202890842s" podCreationTimestamp="2026-03-09 02:54:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:54:58.193891244 +0000 UTC m=+822.783554996" watchObservedRunningTime="2026-03-09 02:54:58.202890842 +0000 UTC m=+822.792554614"
Mar 09 02:54:58 crc kubenswrapper[4901]: I0309 02:54:58.204225 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2"
Mar 09 02:54:58 crc kubenswrapper[4901]: I0309 02:54:58.437732 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-j4cvp"]
Mar 09 02:54:58 crc kubenswrapper[4901]: I0309 02:54:58.438657 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:58 crc kubenswrapper[4901]: I0309 02:54:58.439486 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:58 crc kubenswrapper[4901]: E0309 02:54:58.472779 4901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j4cvp_crc-storage_e0ede59c-7e38-437e-ba16-82adff7f9ef4_0(bfea59b559b3649687ea2cba4cbd7a8a8801d89e7e62f3e47678ca90ec74f713): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 02:54:58 crc kubenswrapper[4901]: E0309 02:54:58.473003 4901 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j4cvp_crc-storage_e0ede59c-7e38-437e-ba16-82adff7f9ef4_0(bfea59b559b3649687ea2cba4cbd7a8a8801d89e7e62f3e47678ca90ec74f713): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:58 crc kubenswrapper[4901]: E0309 02:54:58.473160 4901 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j4cvp_crc-storage_e0ede59c-7e38-437e-ba16-82adff7f9ef4_0(bfea59b559b3649687ea2cba4cbd7a8a8801d89e7e62f3e47678ca90ec74f713): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:54:58 crc kubenswrapper[4901]: E0309 02:54:58.473446 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-j4cvp_crc-storage(e0ede59c-7e38-437e-ba16-82adff7f9ef4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-j4cvp_crc-storage(e0ede59c-7e38-437e-ba16-82adff7f9ef4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j4cvp_crc-storage_e0ede59c-7e38-437e-ba16-82adff7f9ef4_0(bfea59b559b3649687ea2cba4cbd7a8a8801d89e7e62f3e47678ca90ec74f713): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-j4cvp" podUID="e0ede59c-7e38-437e-ba16-82adff7f9ef4"
Mar 09 02:54:59 crc kubenswrapper[4901]: I0309 02:54:59.164954 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2"
Mar 09 02:54:59 crc kubenswrapper[4901]: I0309 02:54:59.166060 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2"
Mar 09 02:54:59 crc kubenswrapper[4901]: I0309 02:54:59.258829 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2"
Mar 09 02:55:12 crc kubenswrapper[4901]: I0309 02:55:12.106419 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j4cvp"
Mar 09 02:55:12 crc kubenswrapper[4901]: I0309 02:55:12.107675 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-j4cvp" Mar 09 02:55:12 crc kubenswrapper[4901]: I0309 02:55:12.386572 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-j4cvp"] Mar 09 02:55:12 crc kubenswrapper[4901]: W0309 02:55:12.394421 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0ede59c_7e38_437e_ba16_82adff7f9ef4.slice/crio-d0896a46a52d7f70cacdb57109e28be4ceb73ea031a225559bbf64e40c03f8e0 WatchSource:0}: Error finding container d0896a46a52d7f70cacdb57109e28be4ceb73ea031a225559bbf64e40c03f8e0: Status 404 returned error can't find the container with id d0896a46a52d7f70cacdb57109e28be4ceb73ea031a225559bbf64e40c03f8e0 Mar 09 02:55:12 crc kubenswrapper[4901]: I0309 02:55:12.397809 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 02:55:13 crc kubenswrapper[4901]: I0309 02:55:13.264859 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-j4cvp" event={"ID":"e0ede59c-7e38-437e-ba16-82adff7f9ef4","Type":"ContainerStarted","Data":"d0896a46a52d7f70cacdb57109e28be4ceb73ea031a225559bbf64e40c03f8e0"} Mar 09 02:55:14 crc kubenswrapper[4901]: I0309 02:55:14.273692 4901 generic.go:334] "Generic (PLEG): container finished" podID="e0ede59c-7e38-437e-ba16-82adff7f9ef4" containerID="0a620c1720dbb11c2c0624f8c305c9ee0a274e8f00074f89bbf3562aba93deb8" exitCode=0 Mar 09 02:55:14 crc kubenswrapper[4901]: I0309 02:55:14.273781 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-j4cvp" event={"ID":"e0ede59c-7e38-437e-ba16-82adff7f9ef4","Type":"ContainerDied","Data":"0a620c1720dbb11c2c0624f8c305c9ee0a274e8f00074f89bbf3562aba93deb8"} Mar 09 02:55:15 crc kubenswrapper[4901]: I0309 02:55:15.612932 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-j4cvp" Mar 09 02:55:15 crc kubenswrapper[4901]: I0309 02:55:15.760296 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbhpz\" (UniqueName: \"kubernetes.io/projected/e0ede59c-7e38-437e-ba16-82adff7f9ef4-kube-api-access-lbhpz\") pod \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " Mar 09 02:55:15 crc kubenswrapper[4901]: I0309 02:55:15.760860 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e0ede59c-7e38-437e-ba16-82adff7f9ef4-crc-storage\") pod \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " Mar 09 02:55:15 crc kubenswrapper[4901]: I0309 02:55:15.761047 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e0ede59c-7e38-437e-ba16-82adff7f9ef4-node-mnt\") pod \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\" (UID: \"e0ede59c-7e38-437e-ba16-82adff7f9ef4\") " Mar 09 02:55:15 crc kubenswrapper[4901]: I0309 02:55:15.761188 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0ede59c-7e38-437e-ba16-82adff7f9ef4-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "e0ede59c-7e38-437e-ba16-82adff7f9ef4" (UID: "e0ede59c-7e38-437e-ba16-82adff7f9ef4"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 02:55:15 crc kubenswrapper[4901]: I0309 02:55:15.761475 4901 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e0ede59c-7e38-437e-ba16-82adff7f9ef4-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:15 crc kubenswrapper[4901]: I0309 02:55:15.768008 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ede59c-7e38-437e-ba16-82adff7f9ef4-kube-api-access-lbhpz" (OuterVolumeSpecName: "kube-api-access-lbhpz") pod "e0ede59c-7e38-437e-ba16-82adff7f9ef4" (UID: "e0ede59c-7e38-437e-ba16-82adff7f9ef4"). InnerVolumeSpecName "kube-api-access-lbhpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:55:15 crc kubenswrapper[4901]: I0309 02:55:15.790570 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ede59c-7e38-437e-ba16-82adff7f9ef4-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "e0ede59c-7e38-437e-ba16-82adff7f9ef4" (UID: "e0ede59c-7e38-437e-ba16-82adff7f9ef4"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 02:55:15 crc kubenswrapper[4901]: I0309 02:55:15.862736 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbhpz\" (UniqueName: \"kubernetes.io/projected/e0ede59c-7e38-437e-ba16-82adff7f9ef4-kube-api-access-lbhpz\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:15 crc kubenswrapper[4901]: I0309 02:55:15.862787 4901 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e0ede59c-7e38-437e-ba16-82adff7f9ef4-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.291796 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-j4cvp" event={"ID":"e0ede59c-7e38-437e-ba16-82adff7f9ef4","Type":"ContainerDied","Data":"d0896a46a52d7f70cacdb57109e28be4ceb73ea031a225559bbf64e40c03f8e0"} Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.291836 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0896a46a52d7f70cacdb57109e28be4ceb73ea031a225559bbf64e40c03f8e0" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.291966 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-j4cvp" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.313982 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wwr9b"] Mar 09 02:55:16 crc kubenswrapper[4901]: E0309 02:55:16.314324 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ede59c-7e38-437e-ba16-82adff7f9ef4" containerName="storage" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.314351 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ede59c-7e38-437e-ba16-82adff7f9ef4" containerName="storage" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.314469 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ede59c-7e38-437e-ba16-82adff7f9ef4" containerName="storage" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.315607 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.331659 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wwr9b"] Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.471500 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-catalog-content\") pod \"certified-operators-wwr9b\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.471705 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-utilities\") pod \"certified-operators-wwr9b\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " pod="openshift-marketplace/certified-operators-wwr9b" 
Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.471748 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c72r\" (UniqueName: \"kubernetes.io/projected/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-kube-api-access-2c72r\") pod \"certified-operators-wwr9b\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.572641 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-catalog-content\") pod \"certified-operators-wwr9b\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.572722 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c72r\" (UniqueName: \"kubernetes.io/projected/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-kube-api-access-2c72r\") pod \"certified-operators-wwr9b\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.572752 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-utilities\") pod \"certified-operators-wwr9b\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.573285 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-catalog-content\") pod \"certified-operators-wwr9b\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " 
pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.573315 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-utilities\") pod \"certified-operators-wwr9b\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.610882 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c72r\" (UniqueName: \"kubernetes.io/projected/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-kube-api-access-2c72r\") pod \"certified-operators-wwr9b\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:16 crc kubenswrapper[4901]: I0309 02:55:16.642130 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:17 crc kubenswrapper[4901]: I0309 02:55:17.113138 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wwr9b"] Mar 09 02:55:17 crc kubenswrapper[4901]: W0309 02:55:17.118666 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50574bea_7c1e_4e25_8700_cbcbcc4d07e4.slice/crio-ae8ddcf8ba28eaf4f42b596cf80a7eae6298786611b2ca5c27119d498a8fa858 WatchSource:0}: Error finding container ae8ddcf8ba28eaf4f42b596cf80a7eae6298786611b2ca5c27119d498a8fa858: Status 404 returned error can't find the container with id ae8ddcf8ba28eaf4f42b596cf80a7eae6298786611b2ca5c27119d498a8fa858 Mar 09 02:55:17 crc kubenswrapper[4901]: I0309 02:55:17.299052 4901 generic.go:334] "Generic (PLEG): container finished" podID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerID="2b4257388d4d8358b9fcabf020290e2a54ad5ffce8683137a7edf9ac4b90e8fa" exitCode=0 Mar 09 
02:55:17 crc kubenswrapper[4901]: I0309 02:55:17.299257 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwr9b" event={"ID":"50574bea-7c1e-4e25-8700-cbcbcc4d07e4","Type":"ContainerDied","Data":"2b4257388d4d8358b9fcabf020290e2a54ad5ffce8683137a7edf9ac4b90e8fa"} Mar 09 02:55:17 crc kubenswrapper[4901]: I0309 02:55:17.299365 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwr9b" event={"ID":"50574bea-7c1e-4e25-8700-cbcbcc4d07e4","Type":"ContainerStarted","Data":"ae8ddcf8ba28eaf4f42b596cf80a7eae6298786611b2ca5c27119d498a8fa858"} Mar 09 02:55:17 crc kubenswrapper[4901]: I0309 02:55:17.324913 4901 scope.go:117] "RemoveContainer" containerID="0f00772dc49e31beb94fb8c970e126f77b33ecff59ca8d6fbc856704648ec6d1" Mar 09 02:55:18 crc kubenswrapper[4901]: I0309 02:55:18.313399 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-429fk_a0d0e040-7ca3-4af8-9f02-d96cff6b3edf/kube-multus/1.log" Mar 09 02:55:18 crc kubenswrapper[4901]: I0309 02:55:18.316186 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwr9b" event={"ID":"50574bea-7c1e-4e25-8700-cbcbcc4d07e4","Type":"ContainerStarted","Data":"71788b5b9f6f0686fd195a991793a4e3cab0293b4e0422087f4a5d992cbb8a8b"} Mar 09 02:55:19 crc kubenswrapper[4901]: I0309 02:55:19.325529 4901 generic.go:334] "Generic (PLEG): container finished" podID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerID="71788b5b9f6f0686fd195a991793a4e3cab0293b4e0422087f4a5d992cbb8a8b" exitCode=0 Mar 09 02:55:19 crc kubenswrapper[4901]: I0309 02:55:19.325584 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwr9b" event={"ID":"50574bea-7c1e-4e25-8700-cbcbcc4d07e4","Type":"ContainerDied","Data":"71788b5b9f6f0686fd195a991793a4e3cab0293b4e0422087f4a5d992cbb8a8b"} Mar 09 02:55:20 crc kubenswrapper[4901]: I0309 02:55:20.339276 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwr9b" event={"ID":"50574bea-7c1e-4e25-8700-cbcbcc4d07e4","Type":"ContainerStarted","Data":"822b02fd0ce37531b90882ad8f7b89224f61fd7e709c87f08fc8afe430b26d66"} Mar 09 02:55:20 crc kubenswrapper[4901]: I0309 02:55:20.378451 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wwr9b" podStartSLOduration=1.934678334 podStartE2EDuration="4.378424438s" podCreationTimestamp="2026-03-09 02:55:16 +0000 UTC" firstStartedPulling="2026-03-09 02:55:17.300845043 +0000 UTC m=+841.890508775" lastFinishedPulling="2026-03-09 02:55:19.744591107 +0000 UTC m=+844.334254879" observedRunningTime="2026-03-09 02:55:20.369148853 +0000 UTC m=+844.958812645" watchObservedRunningTime="2026-03-09 02:55:20.378424438 +0000 UTC m=+844.968088210" Mar 09 02:55:21 crc kubenswrapper[4901]: I0309 02:55:21.245832 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xn8v2" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.340996 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs"] Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.343102 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.345115 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.349286 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs"] Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.467375 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tscjk\" (UniqueName: \"kubernetes.io/projected/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-kube-api-access-tscjk\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.467420 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.467483 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: 
I0309 02:55:23.569081 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.569206 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.569343 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tscjk\" (UniqueName: \"kubernetes.io/projected/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-kube-api-access-tscjk\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.570178 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.570255 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.600196 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tscjk\" (UniqueName: \"kubernetes.io/projected/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-kube-api-access-tscjk\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.709573 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:23 crc kubenswrapper[4901]: I0309 02:55:23.955787 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs"] Mar 09 02:55:23 crc kubenswrapper[4901]: W0309 02:55:23.962314 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd1807d_8e6c_4310_b7ba_e710f97e7d87.slice/crio-8592e8ca9f857901923eb089881429fac078d84852aeeecfad4087e9a986c3b4 WatchSource:0}: Error finding container 8592e8ca9f857901923eb089881429fac078d84852aeeecfad4087e9a986c3b4: Status 404 returned error can't find the container with id 8592e8ca9f857901923eb089881429fac078d84852aeeecfad4087e9a986c3b4 Mar 09 02:55:24 crc kubenswrapper[4901]: I0309 02:55:24.365048 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" 
event={"ID":"ecd1807d-8e6c-4310-b7ba-e710f97e7d87","Type":"ContainerStarted","Data":"143ac1cc1965fe0f35a14363aa87c6a6d7e32cebf8df2b7dd37ceda32d0888e4"} Mar 09 02:55:24 crc kubenswrapper[4901]: I0309 02:55:24.365123 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" event={"ID":"ecd1807d-8e6c-4310-b7ba-e710f97e7d87","Type":"ContainerStarted","Data":"8592e8ca9f857901923eb089881429fac078d84852aeeecfad4087e9a986c3b4"} Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.375625 4901 generic.go:334] "Generic (PLEG): container finished" podID="ecd1807d-8e6c-4310-b7ba-e710f97e7d87" containerID="143ac1cc1965fe0f35a14363aa87c6a6d7e32cebf8df2b7dd37ceda32d0888e4" exitCode=0 Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.375726 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" event={"ID":"ecd1807d-8e6c-4310-b7ba-e710f97e7d87","Type":"ContainerDied","Data":"143ac1cc1965fe0f35a14363aa87c6a6d7e32cebf8df2b7dd37ceda32d0888e4"} Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.693083 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tmkmt"] Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.694797 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.702484 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-catalog-content\") pod \"redhat-operators-tmkmt\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.702553 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-utilities\") pod \"redhat-operators-tmkmt\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.702876 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jtxd\" (UniqueName: \"kubernetes.io/projected/73a96733-bf28-4970-a38b-c26f8b8979ae-kube-api-access-9jtxd\") pod \"redhat-operators-tmkmt\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.722432 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tmkmt"] Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.819703 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-catalog-content\") pod \"redhat-operators-tmkmt\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.820308 4901 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-utilities\") pod \"redhat-operators-tmkmt\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.823403 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jtxd\" (UniqueName: \"kubernetes.io/projected/73a96733-bf28-4970-a38b-c26f8b8979ae-kube-api-access-9jtxd\") pod \"redhat-operators-tmkmt\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.824100 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-utilities\") pod \"redhat-operators-tmkmt\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.824383 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-catalog-content\") pod \"redhat-operators-tmkmt\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:25 crc kubenswrapper[4901]: I0309 02:55:25.860092 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jtxd\" (UniqueName: \"kubernetes.io/projected/73a96733-bf28-4970-a38b-c26f8b8979ae-kube-api-access-9jtxd\") pod \"redhat-operators-tmkmt\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:26 crc kubenswrapper[4901]: I0309 02:55:26.036485 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:26 crc kubenswrapper[4901]: I0309 02:55:26.341423 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tmkmt"] Mar 09 02:55:26 crc kubenswrapper[4901]: I0309 02:55:26.390135 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmkmt" event={"ID":"73a96733-bf28-4970-a38b-c26f8b8979ae","Type":"ContainerStarted","Data":"460b9cfa3a3951460de732d068d86be8624d42207d353acf4ba6878ae7702df2"} Mar 09 02:55:26 crc kubenswrapper[4901]: I0309 02:55:26.642533 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:26 crc kubenswrapper[4901]: I0309 02:55:26.642584 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:26 crc kubenswrapper[4901]: I0309 02:55:26.691086 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:27 crc kubenswrapper[4901]: I0309 02:55:27.398670 4901 generic.go:334] "Generic (PLEG): container finished" podID="ecd1807d-8e6c-4310-b7ba-e710f97e7d87" containerID="76adb4189c02d1eea196cfde567827e523f444b179e566138a2445bcf6a66473" exitCode=0 Mar 09 02:55:27 crc kubenswrapper[4901]: I0309 02:55:27.398727 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" event={"ID":"ecd1807d-8e6c-4310-b7ba-e710f97e7d87","Type":"ContainerDied","Data":"76adb4189c02d1eea196cfde567827e523f444b179e566138a2445bcf6a66473"} Mar 09 02:55:27 crc kubenswrapper[4901]: I0309 02:55:27.400320 4901 generic.go:334] "Generic (PLEG): container finished" podID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerID="54b8ac136f6fe8db2710793429c591661790857339230f30a73f09446b82ece2" 
exitCode=0 Mar 09 02:55:27 crc kubenswrapper[4901]: I0309 02:55:27.400650 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmkmt" event={"ID":"73a96733-bf28-4970-a38b-c26f8b8979ae","Type":"ContainerDied","Data":"54b8ac136f6fe8db2710793429c591661790857339230f30a73f09446b82ece2"} Mar 09 02:55:27 crc kubenswrapper[4901]: I0309 02:55:27.452134 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:28 crc kubenswrapper[4901]: I0309 02:55:28.412344 4901 generic.go:334] "Generic (PLEG): container finished" podID="ecd1807d-8e6c-4310-b7ba-e710f97e7d87" containerID="7058064e22d120adefa5d1d7a4befe2c51e722cb825448b48141912b4e6ac402" exitCode=0 Mar 09 02:55:28 crc kubenswrapper[4901]: I0309 02:55:28.412467 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" event={"ID":"ecd1807d-8e6c-4310-b7ba-e710f97e7d87","Type":"ContainerDied","Data":"7058064e22d120adefa5d1d7a4befe2c51e722cb825448b48141912b4e6ac402"} Mar 09 02:55:28 crc kubenswrapper[4901]: I0309 02:55:28.419637 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmkmt" event={"ID":"73a96733-bf28-4970-a38b-c26f8b8979ae","Type":"ContainerStarted","Data":"4d1b3597f16e565c1cac28f9373337393bf3a42ba9e28584a3a52ac93c7eb7c4"} Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.428876 4901 generic.go:334] "Generic (PLEG): container finished" podID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerID="4d1b3597f16e565c1cac28f9373337393bf3a42ba9e28584a3a52ac93c7eb7c4" exitCode=0 Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.429043 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmkmt" 
event={"ID":"73a96733-bf28-4970-a38b-c26f8b8979ae","Type":"ContainerDied","Data":"4d1b3597f16e565c1cac28f9373337393bf3a42ba9e28584a3a52ac93c7eb7c4"} Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.717915 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.783147 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tscjk\" (UniqueName: \"kubernetes.io/projected/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-kube-api-access-tscjk\") pod \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.783269 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-bundle\") pod \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.783345 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-util\") pod \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\" (UID: \"ecd1807d-8e6c-4310-b7ba-e710f97e7d87\") " Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.784752 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-bundle" (OuterVolumeSpecName: "bundle") pod "ecd1807d-8e6c-4310-b7ba-e710f97e7d87" (UID: "ecd1807d-8e6c-4310-b7ba-e710f97e7d87"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.789027 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-kube-api-access-tscjk" (OuterVolumeSpecName: "kube-api-access-tscjk") pod "ecd1807d-8e6c-4310-b7ba-e710f97e7d87" (UID: "ecd1807d-8e6c-4310-b7ba-e710f97e7d87"). InnerVolumeSpecName "kube-api-access-tscjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.871241 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-util" (OuterVolumeSpecName: "util") pod "ecd1807d-8e6c-4310-b7ba-e710f97e7d87" (UID: "ecd1807d-8e6c-4310-b7ba-e710f97e7d87"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.884696 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tscjk\" (UniqueName: \"kubernetes.io/projected/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-kube-api-access-tscjk\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.885046 4901 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:29 crc kubenswrapper[4901]: I0309 02:55:29.885057 4901 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecd1807d-8e6c-4310-b7ba-e710f97e7d87-util\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.277033 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wwr9b"] Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.277296 4901 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-wwr9b" podUID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerName="registry-server" containerID="cri-o://822b02fd0ce37531b90882ad8f7b89224f61fd7e709c87f08fc8afe430b26d66" gracePeriod=2 Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.442550 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.444780 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs" event={"ID":"ecd1807d-8e6c-4310-b7ba-e710f97e7d87","Type":"ContainerDied","Data":"8592e8ca9f857901923eb089881429fac078d84852aeeecfad4087e9a986c3b4"} Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.445376 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8592e8ca9f857901923eb089881429fac078d84852aeeecfad4087e9a986c3b4" Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.446423 4901 generic.go:334] "Generic (PLEG): container finished" podID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerID="822b02fd0ce37531b90882ad8f7b89224f61fd7e709c87f08fc8afe430b26d66" exitCode=0 Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.446505 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwr9b" event={"ID":"50574bea-7c1e-4e25-8700-cbcbcc4d07e4","Type":"ContainerDied","Data":"822b02fd0ce37531b90882ad8f7b89224f61fd7e709c87f08fc8afe430b26d66"} Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.450437 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmkmt" event={"ID":"73a96733-bf28-4970-a38b-c26f8b8979ae","Type":"ContainerStarted","Data":"30b2331760d5f74b6f96ff6de84b0964778f01ee4eaacf47c3119c6e6aa25562"} Mar 09 02:55:30 crc 
kubenswrapper[4901]: I0309 02:55:30.493826 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tmkmt" podStartSLOduration=3.010489719 podStartE2EDuration="5.493809033s" podCreationTimestamp="2026-03-09 02:55:25 +0000 UTC" firstStartedPulling="2026-03-09 02:55:27.401410829 +0000 UTC m=+851.991074561" lastFinishedPulling="2026-03-09 02:55:29.884730113 +0000 UTC m=+854.474393875" observedRunningTime="2026-03-09 02:55:30.469744731 +0000 UTC m=+855.059408473" watchObservedRunningTime="2026-03-09 02:55:30.493809033 +0000 UTC m=+855.083472765" Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.685132 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.693207 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-utilities\") pod \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.693274 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-catalog-content\") pod \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.693315 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c72r\" (UniqueName: \"kubernetes.io/projected/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-kube-api-access-2c72r\") pod \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\" (UID: \"50574bea-7c1e-4e25-8700-cbcbcc4d07e4\") " Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.694543 4901 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-utilities" (OuterVolumeSpecName: "utilities") pod "50574bea-7c1e-4e25-8700-cbcbcc4d07e4" (UID: "50574bea-7c1e-4e25-8700-cbcbcc4d07e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.706466 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-kube-api-access-2c72r" (OuterVolumeSpecName: "kube-api-access-2c72r") pod "50574bea-7c1e-4e25-8700-cbcbcc4d07e4" (UID: "50574bea-7c1e-4e25-8700-cbcbcc4d07e4"). InnerVolumeSpecName "kube-api-access-2c72r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.767452 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50574bea-7c1e-4e25-8700-cbcbcc4d07e4" (UID: "50574bea-7c1e-4e25-8700-cbcbcc4d07e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.794516 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.794566 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c72r\" (UniqueName: \"kubernetes.io/projected/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-kube-api-access-2c72r\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:30 crc kubenswrapper[4901]: I0309 02:55:30.794634 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50574bea-7c1e-4e25-8700-cbcbcc4d07e4-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:31 crc kubenswrapper[4901]: I0309 02:55:31.461575 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwr9b" event={"ID":"50574bea-7c1e-4e25-8700-cbcbcc4d07e4","Type":"ContainerDied","Data":"ae8ddcf8ba28eaf4f42b596cf80a7eae6298786611b2ca5c27119d498a8fa858"} Mar 09 02:55:31 crc kubenswrapper[4901]: I0309 02:55:31.461661 4901 scope.go:117] "RemoveContainer" containerID="822b02fd0ce37531b90882ad8f7b89224f61fd7e709c87f08fc8afe430b26d66" Mar 09 02:55:31 crc kubenswrapper[4901]: I0309 02:55:31.461617 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wwr9b" Mar 09 02:55:31 crc kubenswrapper[4901]: I0309 02:55:31.498015 4901 scope.go:117] "RemoveContainer" containerID="71788b5b9f6f0686fd195a991793a4e3cab0293b4e0422087f4a5d992cbb8a8b" Mar 09 02:55:31 crc kubenswrapper[4901]: I0309 02:55:31.527458 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wwr9b"] Mar 09 02:55:31 crc kubenswrapper[4901]: I0309 02:55:31.541144 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wwr9b"] Mar 09 02:55:31 crc kubenswrapper[4901]: I0309 02:55:31.563388 4901 scope.go:117] "RemoveContainer" containerID="2b4257388d4d8358b9fcabf020290e2a54ad5ffce8683137a7edf9ac4b90e8fa" Mar 09 02:55:32 crc kubenswrapper[4901]: I0309 02:55:32.118285 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" path="/var/lib/kubelet/pods/50574bea-7c1e-4e25-8700-cbcbcc4d07e4/volumes" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.899973 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg"] Mar 09 02:55:33 crc kubenswrapper[4901]: E0309 02:55:33.900297 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd1807d-8e6c-4310-b7ba-e710f97e7d87" containerName="extract" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.900311 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd1807d-8e6c-4310-b7ba-e710f97e7d87" containerName="extract" Mar 09 02:55:33 crc kubenswrapper[4901]: E0309 02:55:33.900322 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerName="extract-utilities" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.900329 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerName="extract-utilities" Mar 09 
02:55:33 crc kubenswrapper[4901]: E0309 02:55:33.900343 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd1807d-8e6c-4310-b7ba-e710f97e7d87" containerName="pull" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.900349 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd1807d-8e6c-4310-b7ba-e710f97e7d87" containerName="pull" Mar 09 02:55:33 crc kubenswrapper[4901]: E0309 02:55:33.900367 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerName="extract-content" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.900373 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerName="extract-content" Mar 09 02:55:33 crc kubenswrapper[4901]: E0309 02:55:33.900387 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd1807d-8e6c-4310-b7ba-e710f97e7d87" containerName="util" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.900393 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd1807d-8e6c-4310-b7ba-e710f97e7d87" containerName="util" Mar 09 02:55:33 crc kubenswrapper[4901]: E0309 02:55:33.900400 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerName="registry-server" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.900406 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerName="registry-server" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.900515 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd1807d-8e6c-4310-b7ba-e710f97e7d87" containerName="extract" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.900536 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="50574bea-7c1e-4e25-8700-cbcbcc4d07e4" containerName="registry-server" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.900999 
4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.910865 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.911147 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-zb5fm" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.911942 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 09 02:55:33 crc kubenswrapper[4901]: I0309 02:55:33.941997 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg"] Mar 09 02:55:34 crc kubenswrapper[4901]: I0309 02:55:34.042833 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvgz8\" (UniqueName: \"kubernetes.io/projected/704c84a3-97b2-4627-a1fe-7d187799db15-kube-api-access-mvgz8\") pod \"nmstate-operator-75c5dccd6c-kzrdg\" (UID: \"704c84a3-97b2-4627-a1fe-7d187799db15\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg" Mar 09 02:55:34 crc kubenswrapper[4901]: I0309 02:55:34.143470 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvgz8\" (UniqueName: \"kubernetes.io/projected/704c84a3-97b2-4627-a1fe-7d187799db15-kube-api-access-mvgz8\") pod \"nmstate-operator-75c5dccd6c-kzrdg\" (UID: \"704c84a3-97b2-4627-a1fe-7d187799db15\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg" Mar 09 02:55:34 crc kubenswrapper[4901]: I0309 02:55:34.167614 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvgz8\" (UniqueName: \"kubernetes.io/projected/704c84a3-97b2-4627-a1fe-7d187799db15-kube-api-access-mvgz8\") pod 
\"nmstate-operator-75c5dccd6c-kzrdg\" (UID: \"704c84a3-97b2-4627-a1fe-7d187799db15\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg" Mar 09 02:55:34 crc kubenswrapper[4901]: I0309 02:55:34.244859 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg" Mar 09 02:55:34 crc kubenswrapper[4901]: I0309 02:55:34.656108 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg"] Mar 09 02:55:34 crc kubenswrapper[4901]: W0309 02:55:34.665473 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod704c84a3_97b2_4627_a1fe_7d187799db15.slice/crio-ce32c8e9a6227a5528131ad8c019926851ec883c5c2f9cd143db300c1032bb8b WatchSource:0}: Error finding container ce32c8e9a6227a5528131ad8c019926851ec883c5c2f9cd143db300c1032bb8b: Status 404 returned error can't find the container with id ce32c8e9a6227a5528131ad8c019926851ec883c5c2f9cd143db300c1032bb8b Mar 09 02:55:35 crc kubenswrapper[4901]: I0309 02:55:35.491832 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg" event={"ID":"704c84a3-97b2-4627-a1fe-7d187799db15","Type":"ContainerStarted","Data":"ce32c8e9a6227a5528131ad8c019926851ec883c5c2f9cd143db300c1032bb8b"} Mar 09 02:55:36 crc kubenswrapper[4901]: I0309 02:55:36.037118 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:36 crc kubenswrapper[4901]: I0309 02:55:36.037182 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:37 crc kubenswrapper[4901]: I0309 02:55:37.147364 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tmkmt" podUID="73a96733-bf28-4970-a38b-c26f8b8979ae" 
containerName="registry-server" probeResult="failure" output=< Mar 09 02:55:37 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 02:55:37 crc kubenswrapper[4901]: > Mar 09 02:55:37 crc kubenswrapper[4901]: I0309 02:55:37.506339 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg" event={"ID":"704c84a3-97b2-4627-a1fe-7d187799db15","Type":"ContainerStarted","Data":"980a1426709292862abe16d83533eace3b8b66e8ffd345e234fea3a5c8984d0e"} Mar 09 02:55:37 crc kubenswrapper[4901]: I0309 02:55:37.531330 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-kzrdg" podStartSLOduration=2.044514835 podStartE2EDuration="4.531307134s" podCreationTimestamp="2026-03-09 02:55:33 +0000 UTC" firstStartedPulling="2026-03-09 02:55:34.66783295 +0000 UTC m=+859.257496722" lastFinishedPulling="2026-03-09 02:55:37.154625279 +0000 UTC m=+861.744289021" observedRunningTime="2026-03-09 02:55:37.526677987 +0000 UTC m=+862.116341739" watchObservedRunningTime="2026-03-09 02:55:37.531307134 +0000 UTC m=+862.120970886" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.100768 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-w2vms"] Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.103115 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2vms" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.105199 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-fzqpc" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.107105 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p"] Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.108108 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.113555 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.123142 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-w2vms"] Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.139617 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p"] Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.156067 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-xfhkr"] Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.157051 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.234095 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs"] Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.234917 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.238771 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.239313 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hrx77" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.239487 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.245050 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs"] Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.255391 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8397d2d3-a343-42d3-9443-03474b7ad195-nmstate-lock\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.255449 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8397d2d3-a343-42d3-9443-03474b7ad195-ovs-socket\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.255472 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdljg\" (UniqueName: \"kubernetes.io/projected/052e1ec6-21bf-4ce6-9460-d639e85112a4-kube-api-access-tdljg\") pod \"nmstate-metrics-69594cc75-w2vms\" (UID: \"052e1ec6-21bf-4ce6-9460-d639e85112a4\") " 
pod="openshift-nmstate/nmstate-metrics-69594cc75-w2vms" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.255510 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjx2m\" (UniqueName: \"kubernetes.io/projected/23c87d69-7fab-4ce7-99bc-9d076777b1ef-kube-api-access-rjx2m\") pod \"nmstate-webhook-786f45cff4-l5p5p\" (UID: \"23c87d69-7fab-4ce7-99bc-9d076777b1ef\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.255535 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8397d2d3-a343-42d3-9443-03474b7ad195-dbus-socket\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.255551 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdx6r\" (UniqueName: \"kubernetes.io/projected/8397d2d3-a343-42d3-9443-03474b7ad195-kube-api-access-pdx6r\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.255565 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/23c87d69-7fab-4ce7-99bc-9d076777b1ef-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-l5p5p\" (UID: \"23c87d69-7fab-4ce7-99bc-9d076777b1ef\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.356883 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-595q6\" (UniqueName: 
\"kubernetes.io/projected/2fa716e4-3b2c-4ad8-b89b-cd7a931a658d-kube-api-access-595q6\") pod \"nmstate-console-plugin-5dcbbd79cf-sdgzs\" (UID: \"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.356927 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8397d2d3-a343-42d3-9443-03474b7ad195-nmstate-lock\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.356955 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8397d2d3-a343-42d3-9443-03474b7ad195-ovs-socket\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.356972 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa716e4-3b2c-4ad8-b89b-cd7a931a658d-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-sdgzs\" (UID: \"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.356993 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdljg\" (UniqueName: \"kubernetes.io/projected/052e1ec6-21bf-4ce6-9460-d639e85112a4-kube-api-access-tdljg\") pod \"nmstate-metrics-69594cc75-w2vms\" (UID: \"052e1ec6-21bf-4ce6-9460-d639e85112a4\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-w2vms" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.357027 4901 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rjx2m\" (UniqueName: \"kubernetes.io/projected/23c87d69-7fab-4ce7-99bc-9d076777b1ef-kube-api-access-rjx2m\") pod \"nmstate-webhook-786f45cff4-l5p5p\" (UID: \"23c87d69-7fab-4ce7-99bc-9d076777b1ef\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.357053 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2fa716e4-3b2c-4ad8-b89b-cd7a931a658d-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-sdgzs\" (UID: \"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.357069 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8397d2d3-a343-42d3-9443-03474b7ad195-dbus-socket\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.357087 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdx6r\" (UniqueName: \"kubernetes.io/projected/8397d2d3-a343-42d3-9443-03474b7ad195-kube-api-access-pdx6r\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.357102 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/23c87d69-7fab-4ce7-99bc-9d076777b1ef-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-l5p5p\" (UID: \"23c87d69-7fab-4ce7-99bc-9d076777b1ef\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.357608 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8397d2d3-a343-42d3-9443-03474b7ad195-nmstate-lock\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.357660 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8397d2d3-a343-42d3-9443-03474b7ad195-ovs-socket\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.358323 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8397d2d3-a343-42d3-9443-03474b7ad195-dbus-socket\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.366595 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/23c87d69-7fab-4ce7-99bc-9d076777b1ef-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-l5p5p\" (UID: \"23c87d69-7fab-4ce7-99bc-9d076777b1ef\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.379204 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdx6r\" (UniqueName: \"kubernetes.io/projected/8397d2d3-a343-42d3-9443-03474b7ad195-kube-api-access-pdx6r\") pod \"nmstate-handler-xfhkr\" (UID: \"8397d2d3-a343-42d3-9443-03474b7ad195\") " pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.392038 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdljg\" (UniqueName: 
\"kubernetes.io/projected/052e1ec6-21bf-4ce6-9460-d639e85112a4-kube-api-access-tdljg\") pod \"nmstate-metrics-69594cc75-w2vms\" (UID: \"052e1ec6-21bf-4ce6-9460-d639e85112a4\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-w2vms" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.412125 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjx2m\" (UniqueName: \"kubernetes.io/projected/23c87d69-7fab-4ce7-99bc-9d076777b1ef-kube-api-access-rjx2m\") pod \"nmstate-webhook-786f45cff4-l5p5p\" (UID: \"23c87d69-7fab-4ce7-99bc-9d076777b1ef\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.432489 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2vms" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.448085 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.465494 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2fa716e4-3b2c-4ad8-b89b-cd7a931a658d-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-sdgzs\" (UID: \"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.465554 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-595q6\" (UniqueName: \"kubernetes.io/projected/2fa716e4-3b2c-4ad8-b89b-cd7a931a658d-kube-api-access-595q6\") pod \"nmstate-console-plugin-5dcbbd79cf-sdgzs\" (UID: \"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.465585 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa716e4-3b2c-4ad8-b89b-cd7a931a658d-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-sdgzs\" (UID: \"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.467254 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2fa716e4-3b2c-4ad8-b89b-cd7a931a658d-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-sdgzs\" (UID: \"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.474791 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa716e4-3b2c-4ad8-b89b-cd7a931a658d-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-sdgzs\" (UID: \"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.481634 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.504498 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-595q6\" (UniqueName: \"kubernetes.io/projected/2fa716e4-3b2c-4ad8-b89b-cd7a931a658d-kube-api-access-595q6\") pod \"nmstate-console-plugin-5dcbbd79cf-sdgzs\" (UID: \"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.523160 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-78d4959985-psdq9"] Mar 09 02:55:43 crc kubenswrapper[4901]: W0309 02:55:43.523517 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8397d2d3_a343_42d3_9443_03474b7ad195.slice/crio-63f03408fd220afc46bd314c40b66178d7b45593bb411c65088302feb6d45b07 WatchSource:0}: Error finding container 63f03408fd220afc46bd314c40b66178d7b45593bb411c65088302feb6d45b07: Status 404 returned error can't find the container with id 63f03408fd220afc46bd314c40b66178d7b45593bb411c65088302feb6d45b07 Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.523913 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.541056 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78d4959985-psdq9"] Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.553912 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xfhkr" event={"ID":"8397d2d3-a343-42d3-9443-03474b7ad195","Type":"ContainerStarted","Data":"63f03408fd220afc46bd314c40b66178d7b45593bb411c65088302feb6d45b07"} Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.555493 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.669332 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-oauth-serving-cert\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.669398 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjzsq\" (UniqueName: \"kubernetes.io/projected/80e11306-e531-4ba0-891d-b43c2b34e8d0-kube-api-access-gjzsq\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.669438 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-service-ca\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.669466 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-console-config\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.669511 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-trusted-ca-bundle\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.669574 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/80e11306-e531-4ba0-891d-b43c2b34e8d0-console-oauth-config\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.669619 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/80e11306-e531-4ba0-891d-b43c2b34e8d0-console-serving-cert\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.675312 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p"] Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.763561 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs"] Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.771324 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/80e11306-e531-4ba0-891d-b43c2b34e8d0-console-serving-cert\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.771383 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-oauth-serving-cert\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.771423 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjzsq\" (UniqueName: \"kubernetes.io/projected/80e11306-e531-4ba0-891d-b43c2b34e8d0-kube-api-access-gjzsq\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.771457 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-service-ca\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.771485 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-console-config\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.771517 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-trusted-ca-bundle\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.771564 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/80e11306-e531-4ba0-891d-b43c2b34e8d0-console-oauth-config\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.772230 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-oauth-serving-cert\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.772327 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-service-ca\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.772461 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-console-config\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.772623 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80e11306-e531-4ba0-891d-b43c2b34e8d0-trusted-ca-bundle\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.776836 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/80e11306-e531-4ba0-891d-b43c2b34e8d0-console-oauth-config\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.776905 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/80e11306-e531-4ba0-891d-b43c2b34e8d0-console-serving-cert\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.788137 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjzsq\" (UniqueName: \"kubernetes.io/projected/80e11306-e531-4ba0-891d-b43c2b34e8d0-kube-api-access-gjzsq\") pod \"console-78d4959985-psdq9\" (UID: \"80e11306-e531-4ba0-891d-b43c2b34e8d0\") " pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.845077 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:43 crc kubenswrapper[4901]: I0309 02:55:43.971034 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-w2vms"] Mar 09 02:55:43 crc kubenswrapper[4901]: W0309 02:55:43.974195 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod052e1ec6_21bf_4ce6_9460_d639e85112a4.slice/crio-c530b1a56a4230fedc45cdb5b5086ac7be606eae0728da9f6c28c21a5b03a72e WatchSource:0}: Error finding container c530b1a56a4230fedc45cdb5b5086ac7be606eae0728da9f6c28c21a5b03a72e: Status 404 returned error can't find the container with id c530b1a56a4230fedc45cdb5b5086ac7be606eae0728da9f6c28c21a5b03a72e Mar 09 02:55:44 crc kubenswrapper[4901]: I0309 02:55:44.048381 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78d4959985-psdq9"] Mar 09 02:55:44 crc kubenswrapper[4901]: W0309 02:55:44.054274 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80e11306_e531_4ba0_891d_b43c2b34e8d0.slice/crio-2636d954fd19d93dd6c8a5e8fb286d4e258c9d0a776868bbe6c8a72bb020e264 WatchSource:0}: Error finding container 2636d954fd19d93dd6c8a5e8fb286d4e258c9d0a776868bbe6c8a72bb020e264: Status 404 returned error can't find the container with id 2636d954fd19d93dd6c8a5e8fb286d4e258c9d0a776868bbe6c8a72bb020e264 Mar 09 02:55:44 crc kubenswrapper[4901]: I0309 02:55:44.561426 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" event={"ID":"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d","Type":"ContainerStarted","Data":"5382055b39eabd8b00563f3c5edeff704b5e41ee2fcd2a3f1f6e23e857879654"} Mar 09 02:55:44 crc kubenswrapper[4901]: I0309 02:55:44.563377 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-69594cc75-w2vms" event={"ID":"052e1ec6-21bf-4ce6-9460-d639e85112a4","Type":"ContainerStarted","Data":"c530b1a56a4230fedc45cdb5b5086ac7be606eae0728da9f6c28c21a5b03a72e"} Mar 09 02:55:44 crc kubenswrapper[4901]: I0309 02:55:44.565542 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78d4959985-psdq9" event={"ID":"80e11306-e531-4ba0-891d-b43c2b34e8d0","Type":"ContainerStarted","Data":"46ae298558cd57f67ebfe994ed2067ff3147a5e84423332fd3c257da64c685d5"} Mar 09 02:55:44 crc kubenswrapper[4901]: I0309 02:55:44.565567 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78d4959985-psdq9" event={"ID":"80e11306-e531-4ba0-891d-b43c2b34e8d0","Type":"ContainerStarted","Data":"2636d954fd19d93dd6c8a5e8fb286d4e258c9d0a776868bbe6c8a72bb020e264"} Mar 09 02:55:44 crc kubenswrapper[4901]: I0309 02:55:44.567702 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" event={"ID":"23c87d69-7fab-4ce7-99bc-9d076777b1ef","Type":"ContainerStarted","Data":"cfc7c3806b428b9be303d04d699977f8c0b1e4c482ef3d85701f5968c6412d85"} Mar 09 02:55:44 crc kubenswrapper[4901]: I0309 02:55:44.591704 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78d4959985-psdq9" podStartSLOduration=1.591675411 podStartE2EDuration="1.591675411s" podCreationTimestamp="2026-03-09 02:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:55:44.588554062 +0000 UTC m=+869.178217844" watchObservedRunningTime="2026-03-09 02:55:44.591675411 +0000 UTC m=+869.181339193" Mar 09 02:55:46 crc kubenswrapper[4901]: I0309 02:55:46.086489 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:46 crc kubenswrapper[4901]: I0309 02:55:46.136447 
4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:47 crc kubenswrapper[4901]: I0309 02:55:47.589811 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" event={"ID":"2fa716e4-3b2c-4ad8-b89b-cd7a931a658d","Type":"ContainerStarted","Data":"cc59f46586fc1bfed6a92b73aefb08a3fad80f891a0e4e7478f722fc13e1026b"} Mar 09 02:55:47 crc kubenswrapper[4901]: I0309 02:55:47.595029 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2vms" event={"ID":"052e1ec6-21bf-4ce6-9460-d639e85112a4","Type":"ContainerStarted","Data":"7d4a320144b8ba39b5b1f7d689768d2f25cb13d9cade69134032f06775332750"} Mar 09 02:55:47 crc kubenswrapper[4901]: I0309 02:55:47.597545 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xfhkr" event={"ID":"8397d2d3-a343-42d3-9443-03474b7ad195","Type":"ContainerStarted","Data":"fa086474180436b1e74e4a8d01a90107765cd46e24b68b745e0f3f4c719a7090"} Mar 09 02:55:47 crc kubenswrapper[4901]: I0309 02:55:47.598539 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:47 crc kubenswrapper[4901]: I0309 02:55:47.600978 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" event={"ID":"23c87d69-7fab-4ce7-99bc-9d076777b1ef","Type":"ContainerStarted","Data":"63d404cea0958bfb5aad9b35957c1edc14789ea6028de43f4fd45e68ce6c4ab6"} Mar 09 02:55:47 crc kubenswrapper[4901]: I0309 02:55:47.601163 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" Mar 09 02:55:47 crc kubenswrapper[4901]: I0309 02:55:47.615695 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-sdgzs" 
podStartSLOduration=1.77748403 podStartE2EDuration="4.615639328s" podCreationTimestamp="2026-03-09 02:55:43 +0000 UTC" firstStartedPulling="2026-03-09 02:55:43.767553372 +0000 UTC m=+868.357217104" lastFinishedPulling="2026-03-09 02:55:46.60570866 +0000 UTC m=+871.195372402" observedRunningTime="2026-03-09 02:55:47.609982485 +0000 UTC m=+872.199646247" watchObservedRunningTime="2026-03-09 02:55:47.615639328 +0000 UTC m=+872.205303090" Mar 09 02:55:47 crc kubenswrapper[4901]: I0309 02:55:47.666076 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p" podStartSLOduration=1.7152126779999999 podStartE2EDuration="4.66604972s" podCreationTimestamp="2026-03-09 02:55:43 +0000 UTC" firstStartedPulling="2026-03-09 02:55:43.682115516 +0000 UTC m=+868.271779248" lastFinishedPulling="2026-03-09 02:55:46.632952558 +0000 UTC m=+871.222616290" observedRunningTime="2026-03-09 02:55:47.637748066 +0000 UTC m=+872.227411838" watchObservedRunningTime="2026-03-09 02:55:47.66604972 +0000 UTC m=+872.255713472" Mar 09 02:55:47 crc kubenswrapper[4901]: I0309 02:55:47.666994 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-xfhkr" podStartSLOduration=1.608826624 podStartE2EDuration="4.666984194s" podCreationTimestamp="2026-03-09 02:55:43 +0000 UTC" firstStartedPulling="2026-03-09 02:55:43.548375981 +0000 UTC m=+868.138039713" lastFinishedPulling="2026-03-09 02:55:46.606533511 +0000 UTC m=+871.196197283" observedRunningTime="2026-03-09 02:55:47.663070825 +0000 UTC m=+872.252734567" watchObservedRunningTime="2026-03-09 02:55:47.666984194 +0000 UTC m=+872.256647936" Mar 09 02:55:49 crc kubenswrapper[4901]: I0309 02:55:49.479585 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tmkmt"] Mar 09 02:55:49 crc kubenswrapper[4901]: I0309 02:55:49.480420 4901 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-tmkmt" podUID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerName="registry-server" containerID="cri-o://30b2331760d5f74b6f96ff6de84b0964778f01ee4eaacf47c3119c6e6aa25562" gracePeriod=2 Mar 09 02:55:49 crc kubenswrapper[4901]: I0309 02:55:49.636772 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2vms" event={"ID":"052e1ec6-21bf-4ce6-9460-d639e85112a4","Type":"ContainerStarted","Data":"444a75c06f8eb3f202e0931b930337b57aaab58b29213425f62c794dd015ca75"} Mar 09 02:55:49 crc kubenswrapper[4901]: I0309 02:55:49.642738 4901 generic.go:334] "Generic (PLEG): container finished" podID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerID="30b2331760d5f74b6f96ff6de84b0964778f01ee4eaacf47c3119c6e6aa25562" exitCode=0 Mar 09 02:55:49 crc kubenswrapper[4901]: I0309 02:55:49.642826 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmkmt" event={"ID":"73a96733-bf28-4970-a38b-c26f8b8979ae","Type":"ContainerDied","Data":"30b2331760d5f74b6f96ff6de84b0964778f01ee4eaacf47c3119c6e6aa25562"} Mar 09 02:55:49 crc kubenswrapper[4901]: I0309 02:55:49.670278 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2vms" podStartSLOduration=1.446148498 podStartE2EDuration="6.670248911s" podCreationTimestamp="2026-03-09 02:55:43 +0000 UTC" firstStartedPulling="2026-03-09 02:55:43.976792953 +0000 UTC m=+868.566456695" lastFinishedPulling="2026-03-09 02:55:49.200893376 +0000 UTC m=+873.790557108" observedRunningTime="2026-03-09 02:55:49.656781021 +0000 UTC m=+874.246444763" watchObservedRunningTime="2026-03-09 02:55:49.670248911 +0000 UTC m=+874.259912663" Mar 09 02:55:49 crc kubenswrapper[4901]: I0309 02:55:49.890730 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.062817 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-catalog-content\") pod \"73a96733-bf28-4970-a38b-c26f8b8979ae\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.063189 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-utilities\") pod \"73a96733-bf28-4970-a38b-c26f8b8979ae\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.063291 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jtxd\" (UniqueName: \"kubernetes.io/projected/73a96733-bf28-4970-a38b-c26f8b8979ae-kube-api-access-9jtxd\") pod \"73a96733-bf28-4970-a38b-c26f8b8979ae\" (UID: \"73a96733-bf28-4970-a38b-c26f8b8979ae\") " Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.064974 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-utilities" (OuterVolumeSpecName: "utilities") pod "73a96733-bf28-4970-a38b-c26f8b8979ae" (UID: "73a96733-bf28-4970-a38b-c26f8b8979ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.073008 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a96733-bf28-4970-a38b-c26f8b8979ae-kube-api-access-9jtxd" (OuterVolumeSpecName: "kube-api-access-9jtxd") pod "73a96733-bf28-4970-a38b-c26f8b8979ae" (UID: "73a96733-bf28-4970-a38b-c26f8b8979ae"). InnerVolumeSpecName "kube-api-access-9jtxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.166009 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jtxd\" (UniqueName: \"kubernetes.io/projected/73a96733-bf28-4970-a38b-c26f8b8979ae-kube-api-access-9jtxd\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.166112 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.219764 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73a96733-bf28-4970-a38b-c26f8b8979ae" (UID: "73a96733-bf28-4970-a38b-c26f8b8979ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.268385 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a96733-bf28-4970-a38b-c26f8b8979ae-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.655002 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmkmt" event={"ID":"73a96733-bf28-4970-a38b-c26f8b8979ae","Type":"ContainerDied","Data":"460b9cfa3a3951460de732d068d86be8624d42207d353acf4ba6878ae7702df2"} Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.655039 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmkmt" Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.655083 4901 scope.go:117] "RemoveContainer" containerID="30b2331760d5f74b6f96ff6de84b0964778f01ee4eaacf47c3119c6e6aa25562" Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.687355 4901 scope.go:117] "RemoveContainer" containerID="4d1b3597f16e565c1cac28f9373337393bf3a42ba9e28584a3a52ac93c7eb7c4" Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.695139 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tmkmt"] Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.702902 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tmkmt"] Mar 09 02:55:50 crc kubenswrapper[4901]: I0309 02:55:50.724546 4901 scope.go:117] "RemoveContainer" containerID="54b8ac136f6fe8db2710793429c591661790857339230f30a73f09446b82ece2" Mar 09 02:55:52 crc kubenswrapper[4901]: I0309 02:55:52.134159 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a96733-bf28-4970-a38b-c26f8b8979ae" path="/var/lib/kubelet/pods/73a96733-bf28-4970-a38b-c26f8b8979ae/volumes" Mar 09 02:55:53 crc kubenswrapper[4901]: I0309 02:55:53.506970 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-xfhkr" Mar 09 02:55:53 crc kubenswrapper[4901]: I0309 02:55:53.845752 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:53 crc kubenswrapper[4901]: I0309 02:55:53.845824 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:53 crc kubenswrapper[4901]: I0309 02:55:53.854303 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78d4959985-psdq9" Mar 09 02:55:54 crc kubenswrapper[4901]: I0309 
02:55:54.698948 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78d4959985-psdq9"
Mar 09 02:55:54 crc kubenswrapper[4901]: I0309 02:55:54.789695 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xc8gr"]
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.148531 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550416-zc9wt"]
Mar 09 02:56:00 crc kubenswrapper[4901]: E0309 02:56:00.149453 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerName="registry-server"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.149474 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerName="registry-server"
Mar 09 02:56:00 crc kubenswrapper[4901]: E0309 02:56:00.149491 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerName="extract-utilities"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.149504 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerName="extract-utilities"
Mar 09 02:56:00 crc kubenswrapper[4901]: E0309 02:56:00.149525 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerName="extract-content"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.149539 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerName="extract-content"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.149714 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a96733-bf28-4970-a38b-c26f8b8979ae" containerName="registry-server"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.150290 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550416-zc9wt"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.153377 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.153553 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.157883 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.161279 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550416-zc9wt"]
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.207102 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d5gv\" (UniqueName: \"kubernetes.io/projected/8fd0bf33-39b8-4401-a8af-63b45e82b5fd-kube-api-access-8d5gv\") pod \"auto-csr-approver-29550416-zc9wt\" (UID: \"8fd0bf33-39b8-4401-a8af-63b45e82b5fd\") " pod="openshift-infra/auto-csr-approver-29550416-zc9wt"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.308123 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d5gv\" (UniqueName: \"kubernetes.io/projected/8fd0bf33-39b8-4401-a8af-63b45e82b5fd-kube-api-access-8d5gv\") pod \"auto-csr-approver-29550416-zc9wt\" (UID: \"8fd0bf33-39b8-4401-a8af-63b45e82b5fd\") " pod="openshift-infra/auto-csr-approver-29550416-zc9wt"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.347833 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d5gv\" (UniqueName: \"kubernetes.io/projected/8fd0bf33-39b8-4401-a8af-63b45e82b5fd-kube-api-access-8d5gv\") pod \"auto-csr-approver-29550416-zc9wt\" (UID: \"8fd0bf33-39b8-4401-a8af-63b45e82b5fd\") " pod="openshift-infra/auto-csr-approver-29550416-zc9wt"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.485008 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550416-zc9wt"
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.781832 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550416-zc9wt"]
Mar 09 02:56:00 crc kubenswrapper[4901]: W0309 02:56:00.793634 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fd0bf33_39b8_4401_a8af_63b45e82b5fd.slice/crio-7278a0e3e1558a5924996c5bde2b87b373a293fd0bc4340c1bfaff5e6db5601c WatchSource:0}: Error finding container 7278a0e3e1558a5924996c5bde2b87b373a293fd0bc4340c1bfaff5e6db5601c: Status 404 returned error can't find the container with id 7278a0e3e1558a5924996c5bde2b87b373a293fd0bc4340c1bfaff5e6db5601c
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.862822 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 02:56:00 crc kubenswrapper[4901]: I0309 02:56:00.862906 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 02:56:01 crc kubenswrapper[4901]: I0309 02:56:01.754800 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550416-zc9wt" event={"ID":"8fd0bf33-39b8-4401-a8af-63b45e82b5fd","Type":"ContainerStarted","Data":"7278a0e3e1558a5924996c5bde2b87b373a293fd0bc4340c1bfaff5e6db5601c"}
Mar 09 02:56:02 crc kubenswrapper[4901]: I0309 02:56:02.765909 4901 generic.go:334] "Generic (PLEG): container finished" podID="8fd0bf33-39b8-4401-a8af-63b45e82b5fd" containerID="b155cf8af242930d449d148a50eeb7478ddef3a68e54ecc4a4107560daad0045" exitCode=0
Mar 09 02:56:02 crc kubenswrapper[4901]: I0309 02:56:02.765992 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550416-zc9wt" event={"ID":"8fd0bf33-39b8-4401-a8af-63b45e82b5fd","Type":"ContainerDied","Data":"b155cf8af242930d449d148a50eeb7478ddef3a68e54ecc4a4107560daad0045"}
Mar 09 02:56:03 crc kubenswrapper[4901]: I0309 02:56:03.460029 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-l5p5p"
Mar 09 02:56:04 crc kubenswrapper[4901]: I0309 02:56:04.103832 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550416-zc9wt"
Mar 09 02:56:04 crc kubenswrapper[4901]: I0309 02:56:04.273077 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d5gv\" (UniqueName: \"kubernetes.io/projected/8fd0bf33-39b8-4401-a8af-63b45e82b5fd-kube-api-access-8d5gv\") pod \"8fd0bf33-39b8-4401-a8af-63b45e82b5fd\" (UID: \"8fd0bf33-39b8-4401-a8af-63b45e82b5fd\") "
Mar 09 02:56:04 crc kubenswrapper[4901]: I0309 02:56:04.281968 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd0bf33-39b8-4401-a8af-63b45e82b5fd-kube-api-access-8d5gv" (OuterVolumeSpecName: "kube-api-access-8d5gv") pod "8fd0bf33-39b8-4401-a8af-63b45e82b5fd" (UID: "8fd0bf33-39b8-4401-a8af-63b45e82b5fd"). InnerVolumeSpecName "kube-api-access-8d5gv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:56:04 crc kubenswrapper[4901]: I0309 02:56:04.375303 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d5gv\" (UniqueName: \"kubernetes.io/projected/8fd0bf33-39b8-4401-a8af-63b45e82b5fd-kube-api-access-8d5gv\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:04 crc kubenswrapper[4901]: I0309 02:56:04.781361 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550416-zc9wt" event={"ID":"8fd0bf33-39b8-4401-a8af-63b45e82b5fd","Type":"ContainerDied","Data":"7278a0e3e1558a5924996c5bde2b87b373a293fd0bc4340c1bfaff5e6db5601c"}
Mar 09 02:56:04 crc kubenswrapper[4901]: I0309 02:56:04.781406 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7278a0e3e1558a5924996c5bde2b87b373a293fd0bc4340c1bfaff5e6db5601c"
Mar 09 02:56:04 crc kubenswrapper[4901]: I0309 02:56:04.781418 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550416-zc9wt"
Mar 09 02:56:05 crc kubenswrapper[4901]: I0309 02:56:05.159330 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550410-lbvpz"]
Mar 09 02:56:05 crc kubenswrapper[4901]: I0309 02:56:05.164311 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550410-lbvpz"]
Mar 09 02:56:06 crc kubenswrapper[4901]: I0309 02:56:06.117632 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d474c00-28a1-469a-aafd-c9b5bc4dd558" path="/var/lib/kubelet/pods/9d474c00-28a1-469a-aafd-c9b5bc4dd558/volumes"
Mar 09 02:56:17 crc kubenswrapper[4901]: I0309 02:56:17.364194 4901 scope.go:117] "RemoveContainer" containerID="91454141e33be50fd8dc939678aa52fb96e5af281a4b959dcac3f73004e29217"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.151370 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"]
Mar 09 02:56:19 crc kubenswrapper[4901]: E0309 02:56:19.152102 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd0bf33-39b8-4401-a8af-63b45e82b5fd" containerName="oc"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.152126 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd0bf33-39b8-4401-a8af-63b45e82b5fd" containerName="oc"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.152361 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd0bf33-39b8-4401-a8af-63b45e82b5fd" containerName="oc"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.153564 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.156740 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.164271 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"]
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.234031 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.234132 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzvn5\" (UniqueName: \"kubernetes.io/projected/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-kube-api-access-mzvn5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.234164 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.335041 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.335114 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzvn5\" (UniqueName: \"kubernetes.io/projected/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-kube-api-access-mzvn5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.335153 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.335471 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.336834 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.371057 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzvn5\" (UniqueName: \"kubernetes.io/projected/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-kube-api-access-mzvn5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.520577 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.831612 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"]
Mar 09 02:56:19 crc kubenswrapper[4901]: W0309 02:56:19.835299 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd3f1bf2_2f9e_4369_9fb6_e042d67eeac6.slice/crio-44e05d7b780aa574a759b1eec1df99e8f9302452f48c4e587c9ff7e1bea8b6c8 WatchSource:0}: Error finding container 44e05d7b780aa574a759b1eec1df99e8f9302452f48c4e587c9ff7e1bea8b6c8: Status 404 returned error can't find the container with id 44e05d7b780aa574a759b1eec1df99e8f9302452f48c4e587c9ff7e1bea8b6c8
Mar 09 02:56:19 crc kubenswrapper[4901]: I0309 02:56:19.853920 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-xc8gr" podUID="65646690-9b87-47d8-a187-207924a2c486" containerName="console" containerID="cri-o://5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045" gracePeriod=15
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.248328 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xc8gr_65646690-9b87-47d8-a187-207924a2c486/console/0.log"
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.248657 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xc8gr"
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.386018 4901 generic.go:334] "Generic (PLEG): container finished" podID="cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" containerID="0e265535226aa952a4387667d10eb27f5a677e232761c9d504a14cce3f6095ae" exitCode=0
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.386167 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx" event={"ID":"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6","Type":"ContainerDied","Data":"0e265535226aa952a4387667d10eb27f5a677e232761c9d504a14cce3f6095ae"}
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.386257 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx" event={"ID":"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6","Type":"ContainerStarted","Data":"44e05d7b780aa574a759b1eec1df99e8f9302452f48c4e587c9ff7e1bea8b6c8"}
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.388455 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xc8gr_65646690-9b87-47d8-a187-207924a2c486/console/0.log"
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.388535 4901 generic.go:334] "Generic (PLEG): container finished" podID="65646690-9b87-47d8-a187-207924a2c486" containerID="5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045" exitCode=2
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.388589 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xc8gr" event={"ID":"65646690-9b87-47d8-a187-207924a2c486","Type":"ContainerDied","Data":"5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045"}
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.388704 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xc8gr" event={"ID":"65646690-9b87-47d8-a187-207924a2c486","Type":"ContainerDied","Data":"f8a83cdb6731c81c2438eb2c6959538df9187f75414e3ab0a69797b0115de3f4"}
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.388741 4901 scope.go:117] "RemoveContainer" containerID="5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045"
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.388873 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xc8gr"
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.416287 4901 scope.go:117] "RemoveContainer" containerID="5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045"
Mar 09 02:56:20 crc kubenswrapper[4901]: E0309 02:56:20.416796 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045\": container with ID starting with 5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045 not found: ID does not exist" containerID="5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045"
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.416837 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045"} err="failed to get container status \"5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045\": rpc error: code = NotFound desc = could not find container \"5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045\": container with ID starting with 5e047bcbc976639e944c1f21371f9d16ac25de2ac2d08b344fee0c7fcdea4045 not found: ID does not exist"
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.448356 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqrxt\" (UniqueName: \"kubernetes.io/projected/65646690-9b87-47d8-a187-207924a2c486-kube-api-access-nqrxt\") pod \"65646690-9b87-47d8-a187-207924a2c486\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") "
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.449534 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-trusted-ca-bundle\") pod \"65646690-9b87-47d8-a187-207924a2c486\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") "
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.449575 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-console-config\") pod \"65646690-9b87-47d8-a187-207924a2c486\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") "
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.449618 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-oauth-serving-cert\") pod \"65646690-9b87-47d8-a187-207924a2c486\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") "
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.449640 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-oauth-config\") pod \"65646690-9b87-47d8-a187-207924a2c486\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") "
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.449702 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-serving-cert\") pod \"65646690-9b87-47d8-a187-207924a2c486\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") "
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.449734 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-service-ca\") pod \"65646690-9b87-47d8-a187-207924a2c486\" (UID: \"65646690-9b87-47d8-a187-207924a2c486\") "
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.450376 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "65646690-9b87-47d8-a187-207924a2c486" (UID: "65646690-9b87-47d8-a187-207924a2c486"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.451303 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-service-ca" (OuterVolumeSpecName: "service-ca") pod "65646690-9b87-47d8-a187-207924a2c486" (UID: "65646690-9b87-47d8-a187-207924a2c486"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.451499 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-console-config" (OuterVolumeSpecName: "console-config") pod "65646690-9b87-47d8-a187-207924a2c486" (UID: "65646690-9b87-47d8-a187-207924a2c486"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.451776 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "65646690-9b87-47d8-a187-207924a2c486" (UID: "65646690-9b87-47d8-a187-207924a2c486"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.455465 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "65646690-9b87-47d8-a187-207924a2c486" (UID: "65646690-9b87-47d8-a187-207924a2c486"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.455515 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65646690-9b87-47d8-a187-207924a2c486-kube-api-access-nqrxt" (OuterVolumeSpecName: "kube-api-access-nqrxt") pod "65646690-9b87-47d8-a187-207924a2c486" (UID: "65646690-9b87-47d8-a187-207924a2c486"). InnerVolumeSpecName "kube-api-access-nqrxt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.455519 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "65646690-9b87-47d8-a187-207924a2c486" (UID: "65646690-9b87-47d8-a187-207924a2c486"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.551054 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqrxt\" (UniqueName: \"kubernetes.io/projected/65646690-9b87-47d8-a187-207924a2c486-kube-api-access-nqrxt\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.551169 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.551263 4901 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-console-config\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.551285 4901 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.551306 4901 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.551325 4901 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65646690-9b87-47d8-a187-207924a2c486-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.551344 4901 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65646690-9b87-47d8-a187-207924a2c486-service-ca\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.739722 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xc8gr"]
Mar 09 02:56:20 crc kubenswrapper[4901]: I0309 02:56:20.746561 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-xc8gr"]
Mar 09 02:56:22 crc kubenswrapper[4901]: I0309 02:56:22.124418 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65646690-9b87-47d8-a187-207924a2c486" path="/var/lib/kubelet/pods/65646690-9b87-47d8-a187-207924a2c486/volumes"
Mar 09 02:56:22 crc kubenswrapper[4901]: I0309 02:56:22.415288 4901 generic.go:334] "Generic (PLEG): container finished" podID="cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" containerID="3bf478bacf594d2c60928e6bfa2d34ebfe63ef85655a64b8a98c535b5aafe94a" exitCode=0
Mar 09 02:56:22 crc kubenswrapper[4901]: I0309 02:56:22.415353 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx" event={"ID":"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6","Type":"ContainerDied","Data":"3bf478bacf594d2c60928e6bfa2d34ebfe63ef85655a64b8a98c535b5aafe94a"}
Mar 09 02:56:23 crc kubenswrapper[4901]: I0309 02:56:23.438514 4901 generic.go:334] "Generic (PLEG): container finished" podID="cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" containerID="0dcade5e850061bae4255cc82675138b39ce00fd4ae3abb429223f8f8ac1f0c5" exitCode=0
Mar 09 02:56:23 crc kubenswrapper[4901]: I0309 02:56:23.438593 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx" event={"ID":"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6","Type":"ContainerDied","Data":"0dcade5e850061bae4255cc82675138b39ce00fd4ae3abb429223f8f8ac1f0c5"}
Mar 09 02:56:24 crc kubenswrapper[4901]: I0309 02:56:24.793035 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:24 crc kubenswrapper[4901]: I0309 02:56:24.811539 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-bundle\") pod \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") "
Mar 09 02:56:24 crc kubenswrapper[4901]: I0309 02:56:24.811641 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzvn5\" (UniqueName: \"kubernetes.io/projected/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-kube-api-access-mzvn5\") pod \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") "
Mar 09 02:56:24 crc kubenswrapper[4901]: I0309 02:56:24.811742 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-util\") pod \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\" (UID: \"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6\") "
Mar 09 02:56:24 crc kubenswrapper[4901]: I0309 02:56:24.813042 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-bundle" (OuterVolumeSpecName: "bundle") pod "cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" (UID: "cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 02:56:24 crc kubenswrapper[4901]: I0309 02:56:24.821774 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-kube-api-access-mzvn5" (OuterVolumeSpecName: "kube-api-access-mzvn5") pod "cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" (UID: "cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6"). InnerVolumeSpecName "kube-api-access-mzvn5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 02:56:24 crc kubenswrapper[4901]: I0309 02:56:24.848449 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-util" (OuterVolumeSpecName: "util") pod "cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" (UID: "cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 02:56:24 crc kubenswrapper[4901]: I0309 02:56:24.912800 4901 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-util\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:24 crc kubenswrapper[4901]: I0309 02:56:24.912852 4901 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:24 crc kubenswrapper[4901]: I0309 02:56:24.912872 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzvn5\" (UniqueName: \"kubernetes.io/projected/cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6-kube-api-access-mzvn5\") on node \"crc\" DevicePath \"\""
Mar 09 02:56:25 crc kubenswrapper[4901]: I0309 02:56:25.456460 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx" event={"ID":"cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6","Type":"ContainerDied","Data":"44e05d7b780aa574a759b1eec1df99e8f9302452f48c4e587c9ff7e1bea8b6c8"}
Mar 09 02:56:25 crc kubenswrapper[4901]: I0309 02:56:25.456806 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44e05d7b780aa574a759b1eec1df99e8f9302452f48c4e587c9ff7e1bea8b6c8"
Mar 09 02:56:25 crc kubenswrapper[4901]: I0309 02:56:25.456542 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx"
Mar 09 02:56:30 crc kubenswrapper[4901]: I0309 02:56:30.863586 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 02:56:30 crc kubenswrapper[4901]: I0309 02:56:30.863968 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.379492 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq"]
Mar 09 02:56:36 crc kubenswrapper[4901]: E0309 02:56:36.380183 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" containerName="extract"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.380198 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" containerName="extract"
Mar 09 02:56:36 crc kubenswrapper[4901]: E0309 02:56:36.380210 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65646690-9b87-47d8-a187-207924a2c486" containerName="console"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.380234 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="65646690-9b87-47d8-a187-207924a2c486" containerName="console"
Mar 09 02:56:36 crc kubenswrapper[4901]: E0309 02:56:36.380246 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" containerName="pull"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.380255 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" containerName="pull"
Mar 09 02:56:36 crc kubenswrapper[4901]: E0309 02:56:36.380272 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" containerName="util"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.380281 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" containerName="util"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.380398 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="65646690-9b87-47d8-a187-207924a2c486" containerName="console"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.380417 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6" containerName="extract"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.380853 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.383438 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.383518 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.387709 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fh84w"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.391628 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.394206 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.403775 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq"]
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.573194 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79f9ef23-e8a8-4608-90fb-83ee291b5794-webhook-cert\") pod \"metallb-operator-controller-manager-64bbdf86d5-89qqq\" (UID: \"79f9ef23-e8a8-4608-90fb-83ee291b5794\") " pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq"
Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.573250 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79f9ef23-e8a8-4608-90fb-83ee291b5794-apiservice-cert\") pod \"metallb-operator-controller-manager-64bbdf86d5-89qqq\" (UID:
\"79f9ef23-e8a8-4608-90fb-83ee291b5794\") " pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.573320 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hcqk\" (UniqueName: \"kubernetes.io/projected/79f9ef23-e8a8-4608-90fb-83ee291b5794-kube-api-access-4hcqk\") pod \"metallb-operator-controller-manager-64bbdf86d5-89qqq\" (UID: \"79f9ef23-e8a8-4608-90fb-83ee291b5794\") " pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.612389 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv"] Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.613654 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.619542 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.619606 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.619842 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qqhn9" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.631803 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv"] Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.674897 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttwmv\" (UniqueName: 
\"kubernetes.io/projected/a342f2a2-396a-4a32-b09e-0e3327534ca6-kube-api-access-ttwmv\") pod \"metallb-operator-webhook-server-7dbdd58ff4-c2bzv\" (UID: \"a342f2a2-396a-4a32-b09e-0e3327534ca6\") " pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.674987 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hcqk\" (UniqueName: \"kubernetes.io/projected/79f9ef23-e8a8-4608-90fb-83ee291b5794-kube-api-access-4hcqk\") pod \"metallb-operator-controller-manager-64bbdf86d5-89qqq\" (UID: \"79f9ef23-e8a8-4608-90fb-83ee291b5794\") " pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.675117 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79f9ef23-e8a8-4608-90fb-83ee291b5794-webhook-cert\") pod \"metallb-operator-controller-manager-64bbdf86d5-89qqq\" (UID: \"79f9ef23-e8a8-4608-90fb-83ee291b5794\") " pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.675171 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a342f2a2-396a-4a32-b09e-0e3327534ca6-webhook-cert\") pod \"metallb-operator-webhook-server-7dbdd58ff4-c2bzv\" (UID: \"a342f2a2-396a-4a32-b09e-0e3327534ca6\") " pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.675195 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a342f2a2-396a-4a32-b09e-0e3327534ca6-apiservice-cert\") pod \"metallb-operator-webhook-server-7dbdd58ff4-c2bzv\" (UID: \"a342f2a2-396a-4a32-b09e-0e3327534ca6\") " 
pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.675278 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79f9ef23-e8a8-4608-90fb-83ee291b5794-apiservice-cert\") pod \"metallb-operator-controller-manager-64bbdf86d5-89qqq\" (UID: \"79f9ef23-e8a8-4608-90fb-83ee291b5794\") " pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.687007 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79f9ef23-e8a8-4608-90fb-83ee291b5794-apiservice-cert\") pod \"metallb-operator-controller-manager-64bbdf86d5-89qqq\" (UID: \"79f9ef23-e8a8-4608-90fb-83ee291b5794\") " pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.687855 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79f9ef23-e8a8-4608-90fb-83ee291b5794-webhook-cert\") pod \"metallb-operator-controller-manager-64bbdf86d5-89qqq\" (UID: \"79f9ef23-e8a8-4608-90fb-83ee291b5794\") " pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.700466 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hcqk\" (UniqueName: \"kubernetes.io/projected/79f9ef23-e8a8-4608-90fb-83ee291b5794-kube-api-access-4hcqk\") pod \"metallb-operator-controller-manager-64bbdf86d5-89qqq\" (UID: \"79f9ef23-e8a8-4608-90fb-83ee291b5794\") " pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.775672 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/a342f2a2-396a-4a32-b09e-0e3327534ca6-webhook-cert\") pod \"metallb-operator-webhook-server-7dbdd58ff4-c2bzv\" (UID: \"a342f2a2-396a-4a32-b09e-0e3327534ca6\") " pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.775712 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a342f2a2-396a-4a32-b09e-0e3327534ca6-apiservice-cert\") pod \"metallb-operator-webhook-server-7dbdd58ff4-c2bzv\" (UID: \"a342f2a2-396a-4a32-b09e-0e3327534ca6\") " pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.775746 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttwmv\" (UniqueName: \"kubernetes.io/projected/a342f2a2-396a-4a32-b09e-0e3327534ca6-kube-api-access-ttwmv\") pod \"metallb-operator-webhook-server-7dbdd58ff4-c2bzv\" (UID: \"a342f2a2-396a-4a32-b09e-0e3327534ca6\") " pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.783766 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a342f2a2-396a-4a32-b09e-0e3327534ca6-apiservice-cert\") pod \"metallb-operator-webhook-server-7dbdd58ff4-c2bzv\" (UID: \"a342f2a2-396a-4a32-b09e-0e3327534ca6\") " pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.783854 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a342f2a2-396a-4a32-b09e-0e3327534ca6-webhook-cert\") pod \"metallb-operator-webhook-server-7dbdd58ff4-c2bzv\" (UID: \"a342f2a2-396a-4a32-b09e-0e3327534ca6\") " pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc 
kubenswrapper[4901]: I0309 02:56:36.798990 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttwmv\" (UniqueName: \"kubernetes.io/projected/a342f2a2-396a-4a32-b09e-0e3327534ca6-kube-api-access-ttwmv\") pod \"metallb-operator-webhook-server-7dbdd58ff4-c2bzv\" (UID: \"a342f2a2-396a-4a32-b09e-0e3327534ca6\") " pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.929276 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:36 crc kubenswrapper[4901]: I0309 02:56:36.994545 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:56:37 crc kubenswrapper[4901]: I0309 02:56:37.194074 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv"] Mar 09 02:56:37 crc kubenswrapper[4901]: W0309 02:56:37.211845 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda342f2a2_396a_4a32_b09e_0e3327534ca6.slice/crio-0305565a6cae7bf51433c7c74906218f8ecf026f5bc0414a6de50c743738d6b9 WatchSource:0}: Error finding container 0305565a6cae7bf51433c7c74906218f8ecf026f5bc0414a6de50c743738d6b9: Status 404 returned error can't find the container with id 0305565a6cae7bf51433c7c74906218f8ecf026f5bc0414a6de50c743738d6b9 Mar 09 02:56:37 crc kubenswrapper[4901]: I0309 02:56:37.226250 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq"] Mar 09 02:56:37 crc kubenswrapper[4901]: W0309 02:56:37.233304 4901 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f9ef23_e8a8_4608_90fb_83ee291b5794.slice/crio-c5da665a12eb9f4021f092b95cd0b98865d63fcf0233b1da28b94146ef0494b4 WatchSource:0}: Error finding container c5da665a12eb9f4021f092b95cd0b98865d63fcf0233b1da28b94146ef0494b4: Status 404 returned error can't find the container with id c5da665a12eb9f4021f092b95cd0b98865d63fcf0233b1da28b94146ef0494b4 Mar 09 02:56:37 crc kubenswrapper[4901]: I0309 02:56:37.529610 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" event={"ID":"79f9ef23-e8a8-4608-90fb-83ee291b5794","Type":"ContainerStarted","Data":"c5da665a12eb9f4021f092b95cd0b98865d63fcf0233b1da28b94146ef0494b4"} Mar 09 02:56:37 crc kubenswrapper[4901]: I0309 02:56:37.530936 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" event={"ID":"a342f2a2-396a-4a32-b09e-0e3327534ca6","Type":"ContainerStarted","Data":"0305565a6cae7bf51433c7c74906218f8ecf026f5bc0414a6de50c743738d6b9"} Mar 09 02:56:42 crc kubenswrapper[4901]: I0309 02:56:42.560955 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" event={"ID":"79f9ef23-e8a8-4608-90fb-83ee291b5794","Type":"ContainerStarted","Data":"8269acb3dbe466ef61d6a9e819e5c513305ee83f3d530832c9c89dd953bbf598"} Mar 09 02:56:42 crc kubenswrapper[4901]: I0309 02:56:42.561727 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:56:42 crc kubenswrapper[4901]: I0309 02:56:42.562650 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" event={"ID":"a342f2a2-396a-4a32-b09e-0e3327534ca6","Type":"ContainerStarted","Data":"328e54868c6d235e157a4dc1a93924328bf71fa9dc052f5520f1eceb838c89f0"} Mar 09 
02:56:42 crc kubenswrapper[4901]: I0309 02:56:42.562827 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:56:42 crc kubenswrapper[4901]: I0309 02:56:42.587168 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" podStartSLOduration=1.778300564 podStartE2EDuration="6.587144877s" podCreationTimestamp="2026-03-09 02:56:36 +0000 UTC" firstStartedPulling="2026-03-09 02:56:37.2360343 +0000 UTC m=+921.825698032" lastFinishedPulling="2026-03-09 02:56:42.044878613 +0000 UTC m=+926.634542345" observedRunningTime="2026-03-09 02:56:42.582348946 +0000 UTC m=+927.172012708" watchObservedRunningTime="2026-03-09 02:56:42.587144877 +0000 UTC m=+927.176808629" Mar 09 02:56:42 crc kubenswrapper[4901]: I0309 02:56:42.610654 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" podStartSLOduration=1.746050292 podStartE2EDuration="6.610630868s" podCreationTimestamp="2026-03-09 02:56:36 +0000 UTC" firstStartedPulling="2026-03-09 02:56:37.215462132 +0000 UTC m=+921.805125854" lastFinishedPulling="2026-03-09 02:56:42.080042698 +0000 UTC m=+926.669706430" observedRunningTime="2026-03-09 02:56:42.606540805 +0000 UTC m=+927.196204567" watchObservedRunningTime="2026-03-09 02:56:42.610630868 +0000 UTC m=+927.200294620" Mar 09 02:56:56 crc kubenswrapper[4901]: I0309 02:56:56.936666 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7dbdd58ff4-c2bzv" Mar 09 02:57:00 crc kubenswrapper[4901]: I0309 02:57:00.863347 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:57:00 crc kubenswrapper[4901]: I0309 02:57:00.863410 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:57:00 crc kubenswrapper[4901]: I0309 02:57:00.863476 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 02:57:00 crc kubenswrapper[4901]: I0309 02:57:00.864058 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37ec3e94088a17553e2b069ce6fa01c84825c1f38b75b23f862711155501cfa6"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 02:57:00 crc kubenswrapper[4901]: I0309 02:57:00.864123 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://37ec3e94088a17553e2b069ce6fa01c84825c1f38b75b23f862711155501cfa6" gracePeriod=600 Mar 09 02:57:01 crc kubenswrapper[4901]: I0309 02:57:01.679174 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="37ec3e94088a17553e2b069ce6fa01c84825c1f38b75b23f862711155501cfa6" exitCode=0 Mar 09 02:57:01 crc kubenswrapper[4901]: I0309 02:57:01.679250 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"37ec3e94088a17553e2b069ce6fa01c84825c1f38b75b23f862711155501cfa6"} Mar 09 02:57:01 crc kubenswrapper[4901]: I0309 02:57:01.679765 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"de1ddcd67e5e6d7dcbea8ef2824b5106d2b931526ee8f8e98968dbc1152811b6"} Mar 09 02:57:01 crc kubenswrapper[4901]: I0309 02:57:01.679812 4901 scope.go:117] "RemoveContainer" containerID="cca9e0fab8d2b8ceab32875581fb830c17911c0a79e4cc7ee07e546219448782" Mar 09 02:57:16 crc kubenswrapper[4901]: I0309 02:57:16.997595 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-64bbdf86d5-89qqq" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.740292 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-llmsx"] Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.743123 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.744164 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph"] Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.744919 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.745961 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.745970 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.746161 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-jztfd" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.746947 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.757671 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph"] Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.840918 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-l7twq"] Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.841744 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-l7twq" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.846087 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.846315 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9mxvf" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.846483 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.847340 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.850508 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-n25nj"] Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.851384 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.854687 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.859188 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-n25nj"] Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.926379 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c138591-09ca-4d38-90fa-61a52081ac72-cert\") pod \"frr-k8s-webhook-server-7f989f654f-z2jph\" (UID: \"6c138591-09ca-4d38-90fa-61a52081ac72\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.926433 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-frr-sockets\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.926457 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-reloader\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.926482 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-metrics\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 
02:57:17.926500 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9spn\" (UniqueName: \"kubernetes.io/projected/95d29111-dd64-49e3-8c7d-2924c047094b-kube-api-access-s9spn\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.926516 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/95d29111-dd64-49e3-8c7d-2924c047094b-frr-startup\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.926531 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95d29111-dd64-49e3-8c7d-2924c047094b-metrics-certs\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.926557 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tblp\" (UniqueName: \"kubernetes.io/projected/6c138591-09ca-4d38-90fa-61a52081ac72-kube-api-access-5tblp\") pod \"frr-k8s-webhook-server-7f989f654f-z2jph\" (UID: \"6c138591-09ca-4d38-90fa-61a52081ac72\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" Mar 09 02:57:17 crc kubenswrapper[4901]: I0309 02:57:17.926583 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-frr-conf\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027169 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/21029bf0-ecef-45a6-8600-10f73ef1949b-memberlist\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027269 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-frr-sockets\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027366 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-reloader\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027409 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53342ff7-72fc-4eda-aa06-6330700e43cb-metrics-certs\") pod \"controller-86ddb6bd46-n25nj\" (UID: \"53342ff7-72fc-4eda-aa06-6330700e43cb\") " pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027451 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53342ff7-72fc-4eda-aa06-6330700e43cb-cert\") pod \"controller-86ddb6bd46-n25nj\" (UID: \"53342ff7-72fc-4eda-aa06-6330700e43cb\") " pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027479 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" 
(UniqueName: \"kubernetes.io/configmap/21029bf0-ecef-45a6-8600-10f73ef1949b-metallb-excludel2\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027505 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/95d29111-dd64-49e3-8c7d-2924c047094b-frr-startup\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027522 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-metrics\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027538 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9spn\" (UniqueName: \"kubernetes.io/projected/95d29111-dd64-49e3-8c7d-2924c047094b-kube-api-access-s9spn\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027558 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95d29111-dd64-49e3-8c7d-2924c047094b-metrics-certs\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027588 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvmm\" (UniqueName: \"kubernetes.io/projected/21029bf0-ecef-45a6-8600-10f73ef1949b-kube-api-access-wrvmm\") pod \"speaker-l7twq\" (UID: 
\"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027615 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-frr-sockets\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027645 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21029bf0-ecef-45a6-8600-10f73ef1949b-metrics-certs\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027665 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tblp\" (UniqueName: \"kubernetes.io/projected/6c138591-09ca-4d38-90fa-61a52081ac72-kube-api-access-5tblp\") pod \"frr-k8s-webhook-server-7f989f654f-z2jph\" (UID: \"6c138591-09ca-4d38-90fa-61a52081ac72\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027722 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-frr-conf\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.027779 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c138591-09ca-4d38-90fa-61a52081ac72-cert\") pod \"frr-k8s-webhook-server-7f989f654f-z2jph\" (UID: \"6c138591-09ca-4d38-90fa-61a52081ac72\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" Mar 09 02:57:18 
crc kubenswrapper[4901]: I0309 02:57:18.027800 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hscfc\" (UniqueName: \"kubernetes.io/projected/53342ff7-72fc-4eda-aa06-6330700e43cb-kube-api-access-hscfc\") pod \"controller-86ddb6bd46-n25nj\" (UID: \"53342ff7-72fc-4eda-aa06-6330700e43cb\") " pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.028150 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-metrics\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.028345 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/95d29111-dd64-49e3-8c7d-2924c047094b-frr-startup\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.028526 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-reloader\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.028809 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/95d29111-dd64-49e3-8c7d-2924c047094b-frr-conf\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.033778 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/95d29111-dd64-49e3-8c7d-2924c047094b-metrics-certs\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.041813 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c138591-09ca-4d38-90fa-61a52081ac72-cert\") pod \"frr-k8s-webhook-server-7f989f654f-z2jph\" (UID: \"6c138591-09ca-4d38-90fa-61a52081ac72\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.046557 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9spn\" (UniqueName: \"kubernetes.io/projected/95d29111-dd64-49e3-8c7d-2924c047094b-kube-api-access-s9spn\") pod \"frr-k8s-llmsx\" (UID: \"95d29111-dd64-49e3-8c7d-2924c047094b\") " pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.050119 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tblp\" (UniqueName: \"kubernetes.io/projected/6c138591-09ca-4d38-90fa-61a52081ac72-kube-api-access-5tblp\") pod \"frr-k8s-webhook-server-7f989f654f-z2jph\" (UID: \"6c138591-09ca-4d38-90fa-61a52081ac72\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.065342 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.074353 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.129384 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53342ff7-72fc-4eda-aa06-6330700e43cb-metrics-certs\") pod \"controller-86ddb6bd46-n25nj\" (UID: \"53342ff7-72fc-4eda-aa06-6330700e43cb\") " pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.129794 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53342ff7-72fc-4eda-aa06-6330700e43cb-cert\") pod \"controller-86ddb6bd46-n25nj\" (UID: \"53342ff7-72fc-4eda-aa06-6330700e43cb\") " pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.129844 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/21029bf0-ecef-45a6-8600-10f73ef1949b-metallb-excludel2\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.130715 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/21029bf0-ecef-45a6-8600-10f73ef1949b-metallb-excludel2\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.129908 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvmm\" (UniqueName: \"kubernetes.io/projected/21029bf0-ecef-45a6-8600-10f73ef1949b-kube-api-access-wrvmm\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 
02:57:18.130798 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21029bf0-ecef-45a6-8600-10f73ef1949b-metrics-certs\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.130878 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hscfc\" (UniqueName: \"kubernetes.io/projected/53342ff7-72fc-4eda-aa06-6330700e43cb-kube-api-access-hscfc\") pod \"controller-86ddb6bd46-n25nj\" (UID: \"53342ff7-72fc-4eda-aa06-6330700e43cb\") " pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.130928 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/21029bf0-ecef-45a6-8600-10f73ef1949b-memberlist\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: E0309 02:57:18.131027 4901 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 02:57:18 crc kubenswrapper[4901]: E0309 02:57:18.132866 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21029bf0-ecef-45a6-8600-10f73ef1949b-memberlist podName:21029bf0-ecef-45a6-8600-10f73ef1949b nodeName:}" failed. No retries permitted until 2026-03-09 02:57:18.632849472 +0000 UTC m=+963.222513204 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/21029bf0-ecef-45a6-8600-10f73ef1949b-memberlist") pod "speaker-l7twq" (UID: "21029bf0-ecef-45a6-8600-10f73ef1949b") : secret "metallb-memberlist" not found Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.133467 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53342ff7-72fc-4eda-aa06-6330700e43cb-metrics-certs\") pod \"controller-86ddb6bd46-n25nj\" (UID: \"53342ff7-72fc-4eda-aa06-6330700e43cb\") " pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.144077 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.147789 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21029bf0-ecef-45a6-8600-10f73ef1949b-metrics-certs\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.155234 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvmm\" (UniqueName: \"kubernetes.io/projected/21029bf0-ecef-45a6-8600-10f73ef1949b-kube-api-access-wrvmm\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.159514 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53342ff7-72fc-4eda-aa06-6330700e43cb-cert\") pod \"controller-86ddb6bd46-n25nj\" (UID: \"53342ff7-72fc-4eda-aa06-6330700e43cb\") " pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.159515 4901 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hscfc\" (UniqueName: \"kubernetes.io/projected/53342ff7-72fc-4eda-aa06-6330700e43cb-kube-api-access-hscfc\") pod \"controller-86ddb6bd46-n25nj\" (UID: \"53342ff7-72fc-4eda-aa06-6330700e43cb\") " pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.165463 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.336571 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph"] Mar 09 02:57:18 crc kubenswrapper[4901]: W0309 02:57:18.353573 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c138591_09ca_4d38_90fa_61a52081ac72.slice/crio-eb938e0cfc97e8aa7b12468f909f9a89300d332f2459160231e8af19b7f9fbd9 WatchSource:0}: Error finding container eb938e0cfc97e8aa7b12468f909f9a89300d332f2459160231e8af19b7f9fbd9: Status 404 returned error can't find the container with id eb938e0cfc97e8aa7b12468f909f9a89300d332f2459160231e8af19b7f9fbd9 Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.379573 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-n25nj"] Mar 09 02:57:18 crc kubenswrapper[4901]: W0309 02:57:18.424087 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53342ff7_72fc_4eda_aa06_6330700e43cb.slice/crio-2d2eeccc603a7f03ecb2b73fac5af1ea72a3ffcb12c80c6a25e6e61020f31d38 WatchSource:0}: Error finding container 2d2eeccc603a7f03ecb2b73fac5af1ea72a3ffcb12c80c6a25e6e61020f31d38: Status 404 returned error can't find the container with id 2d2eeccc603a7f03ecb2b73fac5af1ea72a3ffcb12c80c6a25e6e61020f31d38 Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.636707 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/21029bf0-ecef-45a6-8600-10f73ef1949b-memberlist\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.646935 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/21029bf0-ecef-45a6-8600-10f73ef1949b-memberlist\") pod \"speaker-l7twq\" (UID: \"21029bf0-ecef-45a6-8600-10f73ef1949b\") " pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.756363 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-l7twq" Mar 09 02:57:18 crc kubenswrapper[4901]: W0309 02:57:18.776798 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21029bf0_ecef_45a6_8600_10f73ef1949b.slice/crio-1ef0ac1045e8a9cc239fc3c2bc70e7d968003f040da0a2d8d4a2c9fd66445ffa WatchSource:0}: Error finding container 1ef0ac1045e8a9cc239fc3c2bc70e7d968003f040da0a2d8d4a2c9fd66445ffa: Status 404 returned error can't find the container with id 1ef0ac1045e8a9cc239fc3c2bc70e7d968003f040da0a2d8d4a2c9fd66445ffa Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.807141 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l7twq" event={"ID":"21029bf0-ecef-45a6-8600-10f73ef1949b","Type":"ContainerStarted","Data":"1ef0ac1045e8a9cc239fc3c2bc70e7d968003f040da0a2d8d4a2c9fd66445ffa"} Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.809607 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" event={"ID":"6c138591-09ca-4d38-90fa-61a52081ac72","Type":"ContainerStarted","Data":"eb938e0cfc97e8aa7b12468f909f9a89300d332f2459160231e8af19b7f9fbd9"} Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 
02:57:18.811449 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-n25nj" event={"ID":"53342ff7-72fc-4eda-aa06-6330700e43cb","Type":"ContainerStarted","Data":"b05fab67a4fe01602766fa4acbcbbed0b24bd5423bd7b030f9cafee7ae71e50d"} Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.811481 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-n25nj" event={"ID":"53342ff7-72fc-4eda-aa06-6330700e43cb","Type":"ContainerStarted","Data":"e358aef896c49e38c08cf21a6f3d5f0a43ed6c6a9bdf280da53dfe1f6715a611"} Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.811494 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-n25nj" event={"ID":"53342ff7-72fc-4eda-aa06-6330700e43cb","Type":"ContainerStarted","Data":"2d2eeccc603a7f03ecb2b73fac5af1ea72a3ffcb12c80c6a25e6e61020f31d38"} Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.811635 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.812578 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-llmsx" event={"ID":"95d29111-dd64-49e3-8c7d-2924c047094b","Type":"ContainerStarted","Data":"5ef654047adea6aa94902ba8b2f097ad833909ef7c968c48f64fc04d72583525"} Mar 09 02:57:18 crc kubenswrapper[4901]: I0309 02:57:18.830139 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-n25nj" podStartSLOduration=1.830121766 podStartE2EDuration="1.830121766s" podCreationTimestamp="2026-03-09 02:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:57:18.826022833 +0000 UTC m=+963.415686575" watchObservedRunningTime="2026-03-09 02:57:18.830121766 +0000 UTC m=+963.419785508" Mar 09 02:57:19 crc 
kubenswrapper[4901]: I0309 02:57:19.819931 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l7twq" event={"ID":"21029bf0-ecef-45a6-8600-10f73ef1949b","Type":"ContainerStarted","Data":"b7e7c664df31f6e4722d17bea07d971e1d70907f9cbb45d5b937768848044957"} Mar 09 02:57:19 crc kubenswrapper[4901]: I0309 02:57:19.820358 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-l7twq" Mar 09 02:57:19 crc kubenswrapper[4901]: I0309 02:57:19.820389 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l7twq" event={"ID":"21029bf0-ecef-45a6-8600-10f73ef1949b","Type":"ContainerStarted","Data":"16ba4c628a734a42f8021d407b3bcf1c516dc09cd4db4c36b3646af8d3575476"} Mar 09 02:57:19 crc kubenswrapper[4901]: I0309 02:57:19.837977 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-l7twq" podStartSLOduration=2.837961073 podStartE2EDuration="2.837961073s" podCreationTimestamp="2026-03-09 02:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:57:19.836295021 +0000 UTC m=+964.425958803" watchObservedRunningTime="2026-03-09 02:57:19.837961073 +0000 UTC m=+964.427624795" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.126343 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mgwq2"] Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.127567 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgwq2"] Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.127678 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.231819 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm92j\" (UniqueName: \"kubernetes.io/projected/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-kube-api-access-bm92j\") pod \"redhat-marketplace-mgwq2\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.231881 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-utilities\") pod \"redhat-marketplace-mgwq2\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.231931 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-catalog-content\") pod \"redhat-marketplace-mgwq2\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.333015 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-utilities\") pod \"redhat-marketplace-mgwq2\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.333082 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-catalog-content\") pod \"redhat-marketplace-mgwq2\" 
(UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.333153 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm92j\" (UniqueName: \"kubernetes.io/projected/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-kube-api-access-bm92j\") pod \"redhat-marketplace-mgwq2\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.333854 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-utilities\") pod \"redhat-marketplace-mgwq2\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.334122 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-catalog-content\") pod \"redhat-marketplace-mgwq2\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.353214 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm92j\" (UniqueName: \"kubernetes.io/projected/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-kube-api-access-bm92j\") pod \"redhat-marketplace-mgwq2\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:24 crc kubenswrapper[4901]: I0309 02:57:24.456061 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:25 crc kubenswrapper[4901]: I0309 02:57:25.758384 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgwq2"] Mar 09 02:57:25 crc kubenswrapper[4901]: W0309 02:57:25.766384 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda01c202e_0da7_44f2_b1fa_42ef7d93ad7e.slice/crio-15e4bdb28ea5d7dd4bb5511c58514464d43a4f69c44e96bce4982b941c9b4efe WatchSource:0}: Error finding container 15e4bdb28ea5d7dd4bb5511c58514464d43a4f69c44e96bce4982b941c9b4efe: Status 404 returned error can't find the container with id 15e4bdb28ea5d7dd4bb5511c58514464d43a4f69c44e96bce4982b941c9b4efe Mar 09 02:57:25 crc kubenswrapper[4901]: I0309 02:57:25.863062 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgwq2" event={"ID":"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e","Type":"ContainerStarted","Data":"15e4bdb28ea5d7dd4bb5511c58514464d43a4f69c44e96bce4982b941c9b4efe"} Mar 09 02:57:25 crc kubenswrapper[4901]: I0309 02:57:25.864432 4901 generic.go:334] "Generic (PLEG): container finished" podID="95d29111-dd64-49e3-8c7d-2924c047094b" containerID="c6c342fc01aa9224f97d663ff984511ef23b51bb876e9bc5c1938611aa5c2454" exitCode=0 Mar 09 02:57:25 crc kubenswrapper[4901]: I0309 02:57:25.864710 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-llmsx" event={"ID":"95d29111-dd64-49e3-8c7d-2924c047094b","Type":"ContainerDied","Data":"c6c342fc01aa9224f97d663ff984511ef23b51bb876e9bc5c1938611aa5c2454"} Mar 09 02:57:25 crc kubenswrapper[4901]: I0309 02:57:25.869334 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" 
event={"ID":"6c138591-09ca-4d38-90fa-61a52081ac72","Type":"ContainerStarted","Data":"7e7fc012d3b1f58db5d917c0f3091cdc1318b68861231f8e86e5341ed0ca0438"} Mar 09 02:57:25 crc kubenswrapper[4901]: I0309 02:57:25.869496 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" Mar 09 02:57:25 crc kubenswrapper[4901]: I0309 02:57:25.901377 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" podStartSLOduration=1.783314439 podStartE2EDuration="8.901353192s" podCreationTimestamp="2026-03-09 02:57:17 +0000 UTC" firstStartedPulling="2026-03-09 02:57:18.356368026 +0000 UTC m=+962.946031758" lastFinishedPulling="2026-03-09 02:57:25.474406779 +0000 UTC m=+970.064070511" observedRunningTime="2026-03-09 02:57:25.900303325 +0000 UTC m=+970.489967107" watchObservedRunningTime="2026-03-09 02:57:25.901353192 +0000 UTC m=+970.491016944" Mar 09 02:57:26 crc kubenswrapper[4901]: I0309 02:57:26.879542 4901 generic.go:334] "Generic (PLEG): container finished" podID="95d29111-dd64-49e3-8c7d-2924c047094b" containerID="19772c922b9b3af44138be87a425bc8f3d2df384d6e99a107f76515d014d100a" exitCode=0 Mar 09 02:57:26 crc kubenswrapper[4901]: I0309 02:57:26.880476 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-llmsx" event={"ID":"95d29111-dd64-49e3-8c7d-2924c047094b","Type":"ContainerDied","Data":"19772c922b9b3af44138be87a425bc8f3d2df384d6e99a107f76515d014d100a"} Mar 09 02:57:26 crc kubenswrapper[4901]: I0309 02:57:26.883323 4901 generic.go:334] "Generic (PLEG): container finished" podID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerID="53355a9b10878fa8dc367f9ed248fb2df866e2c47805aafb50d8e19ea55cb5f1" exitCode=0 Mar 09 02:57:26 crc kubenswrapper[4901]: I0309 02:57:26.883439 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgwq2" 
event={"ID":"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e","Type":"ContainerDied","Data":"53355a9b10878fa8dc367f9ed248fb2df866e2c47805aafb50d8e19ea55cb5f1"} Mar 09 02:57:27 crc kubenswrapper[4901]: I0309 02:57:27.897763 4901 generic.go:334] "Generic (PLEG): container finished" podID="95d29111-dd64-49e3-8c7d-2924c047094b" containerID="9f33e4a996ed4ae71b023b5f0925c02983b5086ebc8843d798f11f54ac7f6b41" exitCode=0 Mar 09 02:57:27 crc kubenswrapper[4901]: I0309 02:57:27.897967 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-llmsx" event={"ID":"95d29111-dd64-49e3-8c7d-2924c047094b","Type":"ContainerDied","Data":"9f33e4a996ed4ae71b023b5f0925c02983b5086ebc8843d798f11f54ac7f6b41"} Mar 09 02:57:28 crc kubenswrapper[4901]: I0309 02:57:28.171949 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-n25nj" Mar 09 02:57:28 crc kubenswrapper[4901]: I0309 02:57:28.906476 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-llmsx" event={"ID":"95d29111-dd64-49e3-8c7d-2924c047094b","Type":"ContainerStarted","Data":"feccb820359504cf89b761f4c557b7c03cf7c4ff24d173b8bfba145cc9d235d5"} Mar 09 02:57:28 crc kubenswrapper[4901]: I0309 02:57:28.906769 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-llmsx" event={"ID":"95d29111-dd64-49e3-8c7d-2924c047094b","Type":"ContainerStarted","Data":"c8675be628716ff3fc897c522c891521fd87057e0ad700e5a793df4d74d55390"} Mar 09 02:57:28 crc kubenswrapper[4901]: I0309 02:57:28.906784 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-llmsx" event={"ID":"95d29111-dd64-49e3-8c7d-2924c047094b","Type":"ContainerStarted","Data":"7b99999c1e5386f48736dd607407cda2b92c6f553634db4324ec723563129296"} Mar 09 02:57:28 crc kubenswrapper[4901]: I0309 02:57:28.906792 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-llmsx" 
event={"ID":"95d29111-dd64-49e3-8c7d-2924c047094b","Type":"ContainerStarted","Data":"a0e1a1d4e6bc8a9334c3aa15af5275dd954d424d8deb48ce7811ff81c1f7ac2d"} Mar 09 02:57:28 crc kubenswrapper[4901]: I0309 02:57:28.906801 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-llmsx" event={"ID":"95d29111-dd64-49e3-8c7d-2924c047094b","Type":"ContainerStarted","Data":"0347206bdc939b4b5fc88e4221f7aa8977e43392778cdc6651d9ab36a583ba6b"} Mar 09 02:57:28 crc kubenswrapper[4901]: I0309 02:57:28.907378 4901 generic.go:334] "Generic (PLEG): container finished" podID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerID="5d3dd1d530d4253c5d537f4e04d650d84a9ab3eaad7b29ee545fc4ab6206bf33" exitCode=0 Mar 09 02:57:28 crc kubenswrapper[4901]: I0309 02:57:28.907414 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgwq2" event={"ID":"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e","Type":"ContainerDied","Data":"5d3dd1d530d4253c5d537f4e04d650d84a9ab3eaad7b29ee545fc4ab6206bf33"} Mar 09 02:57:29 crc kubenswrapper[4901]: I0309 02:57:29.917473 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-llmsx" event={"ID":"95d29111-dd64-49e3-8c7d-2924c047094b","Type":"ContainerStarted","Data":"609fe3f28f8babcc72b56e91e51946daf51d25fa0eb9833a4663e58d2824590e"} Mar 09 02:57:29 crc kubenswrapper[4901]: I0309 02:57:29.917772 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:29 crc kubenswrapper[4901]: I0309 02:57:29.919748 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgwq2" event={"ID":"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e","Type":"ContainerStarted","Data":"98e3856142f4088ee2a834c7a3ca1c4a96be53ef6e077ab66e8f17f522b99fd7"} Mar 09 02:57:29 crc kubenswrapper[4901]: I0309 02:57:29.947596 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-llmsx" 
podStartSLOduration=5.697389659 podStartE2EDuration="12.947579357s" podCreationTimestamp="2026-03-09 02:57:17 +0000 UTC" firstStartedPulling="2026-03-09 02:57:18.223216226 +0000 UTC m=+962.812879958" lastFinishedPulling="2026-03-09 02:57:25.473405924 +0000 UTC m=+970.063069656" observedRunningTime="2026-03-09 02:57:29.944399757 +0000 UTC m=+974.534063539" watchObservedRunningTime="2026-03-09 02:57:29.947579357 +0000 UTC m=+974.537243089" Mar 09 02:57:29 crc kubenswrapper[4901]: I0309 02:57:29.971854 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mgwq2" podStartSLOduration=3.564031136 podStartE2EDuration="5.971838687s" podCreationTimestamp="2026-03-09 02:57:24 +0000 UTC" firstStartedPulling="2026-03-09 02:57:26.885838922 +0000 UTC m=+971.475502654" lastFinishedPulling="2026-03-09 02:57:29.293646463 +0000 UTC m=+973.883310205" observedRunningTime="2026-03-09 02:57:29.968132684 +0000 UTC m=+974.557796426" watchObservedRunningTime="2026-03-09 02:57:29.971838687 +0000 UTC m=+974.561502419" Mar 09 02:57:33 crc kubenswrapper[4901]: I0309 02:57:33.065672 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:33 crc kubenswrapper[4901]: I0309 02:57:33.110734 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.457619 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.457691 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.507904 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 
02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.865137 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wfr8q"] Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.867885 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.876135 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-catalog-content\") pod \"community-operators-wfr8q\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.876583 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdfrb\" (UniqueName: \"kubernetes.io/projected/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-kube-api-access-sdfrb\") pod \"community-operators-wfr8q\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.876698 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-utilities\") pod \"community-operators-wfr8q\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.885879 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wfr8q"] Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.978193 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-catalog-content\") pod \"community-operators-wfr8q\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.978392 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdfrb\" (UniqueName: \"kubernetes.io/projected/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-kube-api-access-sdfrb\") pod \"community-operators-wfr8q\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.978497 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-utilities\") pod \"community-operators-wfr8q\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.978975 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-catalog-content\") pod \"community-operators-wfr8q\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:34 crc kubenswrapper[4901]: I0309 02:57:34.980015 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-utilities\") pod \"community-operators-wfr8q\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:35 crc kubenswrapper[4901]: I0309 02:57:35.000085 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdfrb\" (UniqueName: 
\"kubernetes.io/projected/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-kube-api-access-sdfrb\") pod \"community-operators-wfr8q\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:35 crc kubenswrapper[4901]: I0309 02:57:35.023580 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:35 crc kubenswrapper[4901]: I0309 02:57:35.203769 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:35 crc kubenswrapper[4901]: I0309 02:57:35.679572 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wfr8q"] Mar 09 02:57:35 crc kubenswrapper[4901]: W0309 02:57:35.688546 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a52c36_e39b_42f6_bb20_b2c8702cb4bf.slice/crio-c3189beb7946674fa2eac8f395ad7ebcb73d84bf4752703b3cf084ab398eb630 WatchSource:0}: Error finding container c3189beb7946674fa2eac8f395ad7ebcb73d84bf4752703b3cf084ab398eb630: Status 404 returned error can't find the container with id c3189beb7946674fa2eac8f395ad7ebcb73d84bf4752703b3cf084ab398eb630 Mar 09 02:57:35 crc kubenswrapper[4901]: I0309 02:57:35.975053 4901 generic.go:334] "Generic (PLEG): container finished" podID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerID="0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5" exitCode=0 Mar 09 02:57:35 crc kubenswrapper[4901]: I0309 02:57:35.975157 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfr8q" event={"ID":"51a52c36-e39b-42f6-bb20-b2c8702cb4bf","Type":"ContainerDied","Data":"0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5"} Mar 09 02:57:35 crc kubenswrapper[4901]: I0309 02:57:35.975584 4901 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-wfr8q" event={"ID":"51a52c36-e39b-42f6-bb20-b2c8702cb4bf","Type":"ContainerStarted","Data":"c3189beb7946674fa2eac8f395ad7ebcb73d84bf4752703b3cf084ab398eb630"} Mar 09 02:57:36 crc kubenswrapper[4901]: I0309 02:57:36.981626 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfr8q" event={"ID":"51a52c36-e39b-42f6-bb20-b2c8702cb4bf","Type":"ContainerStarted","Data":"7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987"} Mar 09 02:57:37 crc kubenswrapper[4901]: I0309 02:57:37.350476 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgwq2"] Mar 09 02:57:37 crc kubenswrapper[4901]: I0309 02:57:37.350973 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mgwq2" podUID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerName="registry-server" containerID="cri-o://98e3856142f4088ee2a834c7a3ca1c4a96be53ef6e077ab66e8f17f522b99fd7" gracePeriod=2 Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:37.998742 4901 generic.go:334] "Generic (PLEG): container finished" podID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerID="7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987" exitCode=0 Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:37.998859 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfr8q" event={"ID":"51a52c36-e39b-42f6-bb20-b2c8702cb4bf","Type":"ContainerDied","Data":"7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987"} Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.004768 4901 generic.go:334] "Generic (PLEG): container finished" podID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerID="98e3856142f4088ee2a834c7a3ca1c4a96be53ef6e077ab66e8f17f522b99fd7" exitCode=0 Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.004800 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgwq2" event={"ID":"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e","Type":"ContainerDied","Data":"98e3856142f4088ee2a834c7a3ca1c4a96be53ef6e077ab66e8f17f522b99fd7"} Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.004821 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgwq2" event={"ID":"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e","Type":"ContainerDied","Data":"15e4bdb28ea5d7dd4bb5511c58514464d43a4f69c44e96bce4982b941c9b4efe"} Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.004832 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15e4bdb28ea5d7dd4bb5511c58514464d43a4f69c44e96bce4982b941c9b4efe" Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.024782 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.068956 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-llmsx" Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.082079 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-z2jph" Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.223281 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-utilities\") pod \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.223405 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-catalog-content\") pod 
\"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.223448 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm92j\" (UniqueName: \"kubernetes.io/projected/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-kube-api-access-bm92j\") pod \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\" (UID: \"a01c202e-0da7-44f2-b1fa-42ef7d93ad7e\") " Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.224167 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-utilities" (OuterVolumeSpecName: "utilities") pod "a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" (UID: "a01c202e-0da7-44f2-b1fa-42ef7d93ad7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.229694 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-kube-api-access-bm92j" (OuterVolumeSpecName: "kube-api-access-bm92j") pod "a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" (UID: "a01c202e-0da7-44f2-b1fa-42ef7d93ad7e"). InnerVolumeSpecName "kube-api-access-bm92j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.325317 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm92j\" (UniqueName: \"kubernetes.io/projected/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-kube-api-access-bm92j\") on node \"crc\" DevicePath \"\"" Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.325378 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.330272 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" (UID: "a01c202e-0da7-44f2-b1fa-42ef7d93ad7e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.427410 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:57:38 crc kubenswrapper[4901]: I0309 02:57:38.760714 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-l7twq" Mar 09 02:57:39 crc kubenswrapper[4901]: I0309 02:57:39.016381 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgwq2" Mar 09 02:57:39 crc kubenswrapper[4901]: I0309 02:57:39.016400 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfr8q" event={"ID":"51a52c36-e39b-42f6-bb20-b2c8702cb4bf","Type":"ContainerStarted","Data":"a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4"} Mar 09 02:57:39 crc kubenswrapper[4901]: I0309 02:57:39.047880 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wfr8q" podStartSLOduration=2.583054518 podStartE2EDuration="5.047846673s" podCreationTimestamp="2026-03-09 02:57:34 +0000 UTC" firstStartedPulling="2026-03-09 02:57:35.977255686 +0000 UTC m=+980.566919418" lastFinishedPulling="2026-03-09 02:57:38.442047831 +0000 UTC m=+983.031711573" observedRunningTime="2026-03-09 02:57:39.041838592 +0000 UTC m=+983.631502334" watchObservedRunningTime="2026-03-09 02:57:39.047846673 +0000 UTC m=+983.637510415" Mar 09 02:57:39 crc kubenswrapper[4901]: I0309 02:57:39.065189 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgwq2"] Mar 09 02:57:39 crc kubenswrapper[4901]: I0309 02:57:39.079949 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgwq2"] Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.124873 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" path="/var/lib/kubelet/pods/a01c202e-0da7-44f2-b1fa-42ef7d93ad7e/volumes" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.595612 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v"] Mar 09 02:57:40 crc kubenswrapper[4901]: E0309 02:57:40.596022 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerName="extract-utilities" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.596052 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerName="extract-utilities" Mar 09 02:57:40 crc kubenswrapper[4901]: E0309 02:57:40.596066 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerName="registry-server" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.596073 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerName="registry-server" Mar 09 02:57:40 crc kubenswrapper[4901]: E0309 02:57:40.596092 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerName="extract-content" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.596100 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerName="extract-content" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.596227 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01c202e-0da7-44f2-b1fa-42ef7d93ad7e" containerName="registry-server" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.596999 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.599061 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.607831 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v"] Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.758657 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.758734 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.758869 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnd47\" (UniqueName: \"kubernetes.io/projected/a365c75a-4afc-41ca-8005-4674a8097d40-kube-api-access-jnd47\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:40 crc kubenswrapper[4901]: 
I0309 02:57:40.860446 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnd47\" (UniqueName: \"kubernetes.io/projected/a365c75a-4afc-41ca-8005-4674a8097d40-kube-api-access-jnd47\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.860780 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.860919 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.861471 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.861717 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.885405 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnd47\" (UniqueName: \"kubernetes.io/projected/a365c75a-4afc-41ca-8005-4674a8097d40-kube-api-access-jnd47\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:40 crc kubenswrapper[4901]: I0309 02:57:40.909885 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:41 crc kubenswrapper[4901]: I0309 02:57:41.330040 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v"] Mar 09 02:57:41 crc kubenswrapper[4901]: W0309 02:57:41.341264 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda365c75a_4afc_41ca_8005_4674a8097d40.slice/crio-f09e8a2f42648448676c327ab42e8437e0b97e3edb1c03c621ff7caa2ed1a089 WatchSource:0}: Error finding container f09e8a2f42648448676c327ab42e8437e0b97e3edb1c03c621ff7caa2ed1a089: Status 404 returned error can't find the container with id f09e8a2f42648448676c327ab42e8437e0b97e3edb1c03c621ff7caa2ed1a089 Mar 09 02:57:42 crc kubenswrapper[4901]: I0309 02:57:42.036342 4901 generic.go:334] "Generic (PLEG): container finished" podID="a365c75a-4afc-41ca-8005-4674a8097d40" containerID="25cb73c4a40554e881817d13fe765a61d0df1d8484b93dc576afcce006218772" exitCode=0 
Mar 09 02:57:42 crc kubenswrapper[4901]: I0309 02:57:42.036450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" event={"ID":"a365c75a-4afc-41ca-8005-4674a8097d40","Type":"ContainerDied","Data":"25cb73c4a40554e881817d13fe765a61d0df1d8484b93dc576afcce006218772"} Mar 09 02:57:42 crc kubenswrapper[4901]: I0309 02:57:42.036641 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" event={"ID":"a365c75a-4afc-41ca-8005-4674a8097d40","Type":"ContainerStarted","Data":"f09e8a2f42648448676c327ab42e8437e0b97e3edb1c03c621ff7caa2ed1a089"} Mar 09 02:57:45 crc kubenswrapper[4901]: I0309 02:57:45.204635 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:45 crc kubenswrapper[4901]: I0309 02:57:45.205530 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:45 crc kubenswrapper[4901]: I0309 02:57:45.272333 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:46 crc kubenswrapper[4901]: I0309 02:57:46.064146 4901 generic.go:334] "Generic (PLEG): container finished" podID="a365c75a-4afc-41ca-8005-4674a8097d40" containerID="96c76b39e9f941de17b8a28726440af1548e04a0c56b64ea0aa17567feac305a" exitCode=0 Mar 09 02:57:46 crc kubenswrapper[4901]: I0309 02:57:46.064263 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" event={"ID":"a365c75a-4afc-41ca-8005-4674a8097d40","Type":"ContainerDied","Data":"96c76b39e9f941de17b8a28726440af1548e04a0c56b64ea0aa17567feac305a"} Mar 09 02:57:46 crc kubenswrapper[4901]: I0309 02:57:46.118954 4901 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:47 crc kubenswrapper[4901]: I0309 02:57:47.074394 4901 generic.go:334] "Generic (PLEG): container finished" podID="a365c75a-4afc-41ca-8005-4674a8097d40" containerID="6ca70acfaebec1eb5b525486566b6d90510b48f78833658d32d456c9bf4dc1df" exitCode=0 Mar 09 02:57:47 crc kubenswrapper[4901]: I0309 02:57:47.074461 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" event={"ID":"a365c75a-4afc-41ca-8005-4674a8097d40","Type":"ContainerDied","Data":"6ca70acfaebec1eb5b525486566b6d90510b48f78833658d32d456c9bf4dc1df"} Mar 09 02:57:47 crc kubenswrapper[4901]: I0309 02:57:47.952662 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wfr8q"] Mar 09 02:57:48 crc kubenswrapper[4901]: I0309 02:57:48.381922 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:48 crc kubenswrapper[4901]: I0309 02:57:48.496101 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnd47\" (UniqueName: \"kubernetes.io/projected/a365c75a-4afc-41ca-8005-4674a8097d40-kube-api-access-jnd47\") pod \"a365c75a-4afc-41ca-8005-4674a8097d40\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " Mar 09 02:57:48 crc kubenswrapper[4901]: I0309 02:57:48.496161 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-bundle\") pod \"a365c75a-4afc-41ca-8005-4674a8097d40\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " Mar 09 02:57:48 crc kubenswrapper[4901]: I0309 02:57:48.496274 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-util\") pod \"a365c75a-4afc-41ca-8005-4674a8097d40\" (UID: \"a365c75a-4afc-41ca-8005-4674a8097d40\") " Mar 09 02:57:48 crc kubenswrapper[4901]: I0309 02:57:48.497754 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-bundle" (OuterVolumeSpecName: "bundle") pod "a365c75a-4afc-41ca-8005-4674a8097d40" (UID: "a365c75a-4afc-41ca-8005-4674a8097d40"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:57:48 crc kubenswrapper[4901]: I0309 02:57:48.504474 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a365c75a-4afc-41ca-8005-4674a8097d40-kube-api-access-jnd47" (OuterVolumeSpecName: "kube-api-access-jnd47") pod "a365c75a-4afc-41ca-8005-4674a8097d40" (UID: "a365c75a-4afc-41ca-8005-4674a8097d40"). InnerVolumeSpecName "kube-api-access-jnd47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:57:48 crc kubenswrapper[4901]: I0309 02:57:48.512290 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-util" (OuterVolumeSpecName: "util") pod "a365c75a-4afc-41ca-8005-4674a8097d40" (UID: "a365c75a-4afc-41ca-8005-4674a8097d40"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:57:48 crc kubenswrapper[4901]: I0309 02:57:48.598870 4901 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-util\") on node \"crc\" DevicePath \"\"" Mar 09 02:57:48 crc kubenswrapper[4901]: I0309 02:57:48.599030 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnd47\" (UniqueName: \"kubernetes.io/projected/a365c75a-4afc-41ca-8005-4674a8097d40-kube-api-access-jnd47\") on node \"crc\" DevicePath \"\"" Mar 09 02:57:48 crc kubenswrapper[4901]: I0309 02:57:48.599062 4901 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a365c75a-4afc-41ca-8005-4674a8097d40-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.100189 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" event={"ID":"a365c75a-4afc-41ca-8005-4674a8097d40","Type":"ContainerDied","Data":"f09e8a2f42648448676c327ab42e8437e0b97e3edb1c03c621ff7caa2ed1a089"} Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.100559 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09e8a2f42648448676c327ab42e8437e0b97e3edb1c03c621ff7caa2ed1a089" Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.100289 4901 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-wfr8q" podUID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerName="registry-server" containerID="cri-o://a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4" gracePeriod=2 Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.100958 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v" Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.524949 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.719262 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-utilities\") pod \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.719312 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdfrb\" (UniqueName: \"kubernetes.io/projected/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-kube-api-access-sdfrb\") pod \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.719440 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-catalog-content\") pod \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\" (UID: \"51a52c36-e39b-42f6-bb20-b2c8702cb4bf\") " Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.721113 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-utilities" (OuterVolumeSpecName: "utilities") pod 
"51a52c36-e39b-42f6-bb20-b2c8702cb4bf" (UID: "51a52c36-e39b-42f6-bb20-b2c8702cb4bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.731899 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-kube-api-access-sdfrb" (OuterVolumeSpecName: "kube-api-access-sdfrb") pod "51a52c36-e39b-42f6-bb20-b2c8702cb4bf" (UID: "51a52c36-e39b-42f6-bb20-b2c8702cb4bf"). InnerVolumeSpecName "kube-api-access-sdfrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.792987 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51a52c36-e39b-42f6-bb20-b2c8702cb4bf" (UID: "51a52c36-e39b-42f6-bb20-b2c8702cb4bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.821116 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.821152 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 02:57:49 crc kubenswrapper[4901]: I0309 02:57:49.821164 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdfrb\" (UniqueName: \"kubernetes.io/projected/51a52c36-e39b-42f6-bb20-b2c8702cb4bf-kube-api-access-sdfrb\") on node \"crc\" DevicePath \"\"" Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.111483 4901 generic.go:334] "Generic (PLEG): container finished" podID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerID="a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4" exitCode=0 Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.111649 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wfr8q" Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.114865 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfr8q" event={"ID":"51a52c36-e39b-42f6-bb20-b2c8702cb4bf","Type":"ContainerDied","Data":"a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4"} Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.114899 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfr8q" event={"ID":"51a52c36-e39b-42f6-bb20-b2c8702cb4bf","Type":"ContainerDied","Data":"c3189beb7946674fa2eac8f395ad7ebcb73d84bf4752703b3cf084ab398eb630"} Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.114915 4901 scope.go:117] "RemoveContainer" containerID="a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4" Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.145875 4901 scope.go:117] "RemoveContainer" containerID="7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987" Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.163750 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wfr8q"] Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.174200 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wfr8q"] Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.199116 4901 scope.go:117] "RemoveContainer" containerID="0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5" Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.222000 4901 scope.go:117] "RemoveContainer" containerID="a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4" Mar 09 02:57:50 crc kubenswrapper[4901]: E0309 02:57:50.222871 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4\": container with ID starting with a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4 not found: ID does not exist" containerID="a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4" Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.222987 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4"} err="failed to get container status \"a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4\": rpc error: code = NotFound desc = could not find container \"a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4\": container with ID starting with a754020f2bc0fc42a1e72519e602618918cc78d928ef70c3c210c12ef7855dd4 not found: ID does not exist" Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.223288 4901 scope.go:117] "RemoveContainer" containerID="7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987" Mar 09 02:57:50 crc kubenswrapper[4901]: E0309 02:57:50.224048 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987\": container with ID starting with 7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987 not found: ID does not exist" containerID="7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987" Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.224133 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987"} err="failed to get container status \"7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987\": rpc error: code = NotFound desc = could not find container \"7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987\": container with ID 
starting with 7a0ae823a1cc12e7e0810cb523ddded2baa353cd4a638e14e6b643d5e8c04987 not found: ID does not exist" Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.224200 4901 scope.go:117] "RemoveContainer" containerID="0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5" Mar 09 02:57:50 crc kubenswrapper[4901]: E0309 02:57:50.224847 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5\": container with ID starting with 0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5 not found: ID does not exist" containerID="0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5" Mar 09 02:57:50 crc kubenswrapper[4901]: I0309 02:57:50.224941 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5"} err="failed to get container status \"0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5\": rpc error: code = NotFound desc = could not find container \"0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5\": container with ID starting with 0697cf7f889e983a1e1f290abadd3ea00f4a927d0a4be52b71be6d0ddedb56f5 not found: ID does not exist" Mar 09 02:57:52 crc kubenswrapper[4901]: I0309 02:57:52.114354 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" path="/var/lib/kubelet/pods/51a52c36-e39b-42f6-bb20-b2c8702cb4bf/volumes" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.014444 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9"] Mar 09 02:57:53 crc kubenswrapper[4901]: E0309 02:57:53.014647 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a365c75a-4afc-41ca-8005-4674a8097d40" containerName="util" Mar 09 
02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.014657 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a365c75a-4afc-41ca-8005-4674a8097d40" containerName="util" Mar 09 02:57:53 crc kubenswrapper[4901]: E0309 02:57:53.014668 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a365c75a-4afc-41ca-8005-4674a8097d40" containerName="pull" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.014674 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a365c75a-4afc-41ca-8005-4674a8097d40" containerName="pull" Mar 09 02:57:53 crc kubenswrapper[4901]: E0309 02:57:53.014688 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerName="extract-utilities" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.014693 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerName="extract-utilities" Mar 09 02:57:53 crc kubenswrapper[4901]: E0309 02:57:53.014704 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerName="registry-server" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.014709 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerName="registry-server" Mar 09 02:57:53 crc kubenswrapper[4901]: E0309 02:57:53.014721 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a365c75a-4afc-41ca-8005-4674a8097d40" containerName="extract" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.014728 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a365c75a-4afc-41ca-8005-4674a8097d40" containerName="extract" Mar 09 02:57:53 crc kubenswrapper[4901]: E0309 02:57:53.014736 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerName="extract-content" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.014741 
4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerName="extract-content" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.014834 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a365c75a-4afc-41ca-8005-4674a8097d40" containerName="extract" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.014842 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a52c36-e39b-42f6-bb20-b2c8702cb4bf" containerName="registry-server" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.015278 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.018597 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.018841 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.018993 4901 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-tzzjj" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.035846 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9"] Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.067018 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfcrd\" (UniqueName: \"kubernetes.io/projected/f7714088-b2cd-40f7-a35b-6d59b555fa0a-kube-api-access-qfcrd\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6vcp9\" (UID: \"f7714088-b2cd-40f7-a35b-6d59b555fa0a\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.067144 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7714088-b2cd-40f7-a35b-6d59b555fa0a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6vcp9\" (UID: \"f7714088-b2cd-40f7-a35b-6d59b555fa0a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.168728 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7714088-b2cd-40f7-a35b-6d59b555fa0a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6vcp9\" (UID: \"f7714088-b2cd-40f7-a35b-6d59b555fa0a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.168908 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfcrd\" (UniqueName: \"kubernetes.io/projected/f7714088-b2cd-40f7-a35b-6d59b555fa0a-kube-api-access-qfcrd\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6vcp9\" (UID: \"f7714088-b2cd-40f7-a35b-6d59b555fa0a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.169194 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7714088-b2cd-40f7-a35b-6d59b555fa0a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6vcp9\" (UID: \"f7714088-b2cd-40f7-a35b-6d59b555fa0a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.188033 4901 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qfcrd\" (UniqueName: \"kubernetes.io/projected/f7714088-b2cd-40f7-a35b-6d59b555fa0a-kube-api-access-qfcrd\") pod \"cert-manager-operator-controller-manager-66c8bdd694-6vcp9\" (UID: \"f7714088-b2cd-40f7-a35b-6d59b555fa0a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.330447 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" Mar 09 02:57:53 crc kubenswrapper[4901]: I0309 02:57:53.781552 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9"] Mar 09 02:57:53 crc kubenswrapper[4901]: W0309 02:57:53.787646 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7714088_b2cd_40f7_a35b_6d59b555fa0a.slice/crio-d5ae52429841f6c7e8f19c8832e89e90405bcbf9c7e46cdbe71b8f844ce613f0 WatchSource:0}: Error finding container d5ae52429841f6c7e8f19c8832e89e90405bcbf9c7e46cdbe71b8f844ce613f0: Status 404 returned error can't find the container with id d5ae52429841f6c7e8f19c8832e89e90405bcbf9c7e46cdbe71b8f844ce613f0 Mar 09 02:57:54 crc kubenswrapper[4901]: I0309 02:57:54.145138 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" event={"ID":"f7714088-b2cd-40f7-a35b-6d59b555fa0a","Type":"ContainerStarted","Data":"d5ae52429841f6c7e8f19c8832e89e90405bcbf9c7e46cdbe71b8f844ce613f0"} Mar 09 02:57:58 crc kubenswrapper[4901]: I0309 02:57:58.176098 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" 
event={"ID":"f7714088-b2cd-40f7-a35b-6d59b555fa0a","Type":"ContainerStarted","Data":"f91b6fbf4d24787b0bf6b2714b41362f2972d1d848535dab81ab532539455f26"} Mar 09 02:57:58 crc kubenswrapper[4901]: I0309 02:57:58.195884 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-6vcp9" podStartSLOduration=2.783414368 podStartE2EDuration="6.195862867s" podCreationTimestamp="2026-03-09 02:57:52 +0000 UTC" firstStartedPulling="2026-03-09 02:57:53.790309381 +0000 UTC m=+998.379973113" lastFinishedPulling="2026-03-09 02:57:57.20275788 +0000 UTC m=+1001.792421612" observedRunningTime="2026-03-09 02:57:58.195554579 +0000 UTC m=+1002.785218321" watchObservedRunningTime="2026-03-09 02:57:58.195862867 +0000 UTC m=+1002.785526589" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.133428 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550418-rmqtb"] Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.135693 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550418-rmqtb" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.138750 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.139197 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.139252 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.139507 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550418-rmqtb"] Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.257918 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nw9f\" (UniqueName: \"kubernetes.io/projected/9eb148ab-7da6-4a35-9fab-16e7a98612a6-kube-api-access-9nw9f\") pod \"auto-csr-approver-29550418-rmqtb\" (UID: \"9eb148ab-7da6-4a35-9fab-16e7a98612a6\") " pod="openshift-infra/auto-csr-approver-29550418-rmqtb" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.359425 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nw9f\" (UniqueName: \"kubernetes.io/projected/9eb148ab-7da6-4a35-9fab-16e7a98612a6-kube-api-access-9nw9f\") pod \"auto-csr-approver-29550418-rmqtb\" (UID: \"9eb148ab-7da6-4a35-9fab-16e7a98612a6\") " pod="openshift-infra/auto-csr-approver-29550418-rmqtb" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.400756 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nw9f\" (UniqueName: \"kubernetes.io/projected/9eb148ab-7da6-4a35-9fab-16e7a98612a6-kube-api-access-9nw9f\") pod \"auto-csr-approver-29550418-rmqtb\" (UID: \"9eb148ab-7da6-4a35-9fab-16e7a98612a6\") " 
pod="openshift-infra/auto-csr-approver-29550418-rmqtb" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.435844 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-tskr9"] Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.436640 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.474599 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.474951 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.475179 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550418-rmqtb" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.542192 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-tskr9"] Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.580092 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54d6e767-b2a0-472c-ade7-a8f22284526b-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-tskr9\" (UID: \"54d6e767-b2a0-472c-ade7-a8f22284526b\") " pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.580155 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpwkg\" (UniqueName: \"kubernetes.io/projected/54d6e767-b2a0-472c-ade7-a8f22284526b-kube-api-access-hpwkg\") pod \"cert-manager-webhook-6888856db4-tskr9\" (UID: \"54d6e767-b2a0-472c-ade7-a8f22284526b\") " 
pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.681490 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54d6e767-b2a0-472c-ade7-a8f22284526b-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-tskr9\" (UID: \"54d6e767-b2a0-472c-ade7-a8f22284526b\") " pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.681825 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpwkg\" (UniqueName: \"kubernetes.io/projected/54d6e767-b2a0-472c-ade7-a8f22284526b-kube-api-access-hpwkg\") pod \"cert-manager-webhook-6888856db4-tskr9\" (UID: \"54d6e767-b2a0-472c-ade7-a8f22284526b\") " pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.700041 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54d6e767-b2a0-472c-ade7-a8f22284526b-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-tskr9\" (UID: \"54d6e767-b2a0-472c-ade7-a8f22284526b\") " pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.703910 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpwkg\" (UniqueName: \"kubernetes.io/projected/54d6e767-b2a0-472c-ade7-a8f22284526b-kube-api-access-hpwkg\") pod \"cert-manager-webhook-6888856db4-tskr9\" (UID: \"54d6e767-b2a0-472c-ade7-a8f22284526b\") " pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.846382 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" Mar 09 02:58:00 crc kubenswrapper[4901]: I0309 02:58:00.936519 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550418-rmqtb"] Mar 09 02:58:01 crc kubenswrapper[4901]: I0309 02:58:01.192288 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550418-rmqtb" event={"ID":"9eb148ab-7da6-4a35-9fab-16e7a98612a6","Type":"ContainerStarted","Data":"58e3b89fb80d166af74fdab5bd816abc304cb9544f192ab530ca5c13f9631c98"} Mar 09 02:58:01 crc kubenswrapper[4901]: I0309 02:58:01.276729 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-tskr9"] Mar 09 02:58:02 crc kubenswrapper[4901]: I0309 02:58:02.200942 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" event={"ID":"54d6e767-b2a0-472c-ade7-a8f22284526b","Type":"ContainerStarted","Data":"964214b9a4d3daf18b0f58e8e5f62f364ee56c7b6535f1580221b1f032095eff"} Mar 09 02:58:03 crc kubenswrapper[4901]: I0309 02:58:03.207992 4901 generic.go:334] "Generic (PLEG): container finished" podID="9eb148ab-7da6-4a35-9fab-16e7a98612a6" containerID="6e34be1e8a47029c733d66e7e516cb6e368f429de6d79071ffc355445d8f8446" exitCode=0 Mar 09 02:58:03 crc kubenswrapper[4901]: I0309 02:58:03.208052 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550418-rmqtb" event={"ID":"9eb148ab-7da6-4a35-9fab-16e7a98612a6","Type":"ContainerDied","Data":"6e34be1e8a47029c733d66e7e516cb6e368f429de6d79071ffc355445d8f8446"} Mar 09 02:58:04 crc kubenswrapper[4901]: I0309 02:58:04.471208 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550418-rmqtb" Mar 09 02:58:04 crc kubenswrapper[4901]: I0309 02:58:04.576314 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nw9f\" (UniqueName: \"kubernetes.io/projected/9eb148ab-7da6-4a35-9fab-16e7a98612a6-kube-api-access-9nw9f\") pod \"9eb148ab-7da6-4a35-9fab-16e7a98612a6\" (UID: \"9eb148ab-7da6-4a35-9fab-16e7a98612a6\") " Mar 09 02:58:04 crc kubenswrapper[4901]: I0309 02:58:04.593418 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb148ab-7da6-4a35-9fab-16e7a98612a6-kube-api-access-9nw9f" (OuterVolumeSpecName: "kube-api-access-9nw9f") pod "9eb148ab-7da6-4a35-9fab-16e7a98612a6" (UID: "9eb148ab-7da6-4a35-9fab-16e7a98612a6"). InnerVolumeSpecName "kube-api-access-9nw9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:58:04 crc kubenswrapper[4901]: I0309 02:58:04.677985 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nw9f\" (UniqueName: \"kubernetes.io/projected/9eb148ab-7da6-4a35-9fab-16e7a98612a6-kube-api-access-9nw9f\") on node \"crc\" DevicePath \"\"" Mar 09 02:58:05 crc kubenswrapper[4901]: I0309 02:58:05.221737 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550418-rmqtb" event={"ID":"9eb148ab-7da6-4a35-9fab-16e7a98612a6","Type":"ContainerDied","Data":"58e3b89fb80d166af74fdab5bd816abc304cb9544f192ab530ca5c13f9631c98"} Mar 09 02:58:05 crc kubenswrapper[4901]: I0309 02:58:05.221948 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e3b89fb80d166af74fdab5bd816abc304cb9544f192ab530ca5c13f9631c98" Mar 09 02:58:05 crc kubenswrapper[4901]: I0309 02:58:05.221834 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550418-rmqtb" Mar 09 02:58:05 crc kubenswrapper[4901]: I0309 02:58:05.528151 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550412-rf8dh"] Mar 09 02:58:05 crc kubenswrapper[4901]: I0309 02:58:05.531533 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550412-rf8dh"] Mar 09 02:58:06 crc kubenswrapper[4901]: I0309 02:58:06.117004 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb536b6-7907-4645-abe6-bcb1489c6739" path="/var/lib/kubelet/pods/fcb536b6-7907-4645-abe6-bcb1489c6739/volumes" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.233033 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" event={"ID":"54d6e767-b2a0-472c-ade7-a8f22284526b","Type":"ContainerStarted","Data":"ec4566ec44a55747dcefa1958d65a570c14d0c0b5b2212c7e12f26d3a4583149"} Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.233359 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.249650 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" podStartSLOduration=1.899980894 podStartE2EDuration="7.249635223s" podCreationTimestamp="2026-03-09 02:58:00 +0000 UTC" firstStartedPulling="2026-03-09 02:58:01.283030851 +0000 UTC m=+1005.872694623" lastFinishedPulling="2026-03-09 02:58:06.63268521 +0000 UTC m=+1011.222348952" observedRunningTime="2026-03-09 02:58:07.245309385 +0000 UTC m=+1011.834973137" watchObservedRunningTime="2026-03-09 02:58:07.249635223 +0000 UTC m=+1011.839298955" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.394413 4901 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-5545bd876-bncdj"] Mar 09 02:58:07 crc kubenswrapper[4901]: E0309 02:58:07.394700 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb148ab-7da6-4a35-9fab-16e7a98612a6" containerName="oc" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.394719 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb148ab-7da6-4a35-9fab-16e7a98612a6" containerName="oc" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.394865 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb148ab-7da6-4a35-9fab-16e7a98612a6" containerName="oc" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.395328 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.397496 4901 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-lsb8m" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.400434 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bncdj"] Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.422674 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kf7f\" (UniqueName: \"kubernetes.io/projected/f07a88fa-51e7-43c8-b640-fdf34d1d2957-kube-api-access-4kf7f\") pod \"cert-manager-cainjector-5545bd876-bncdj\" (UID: \"f07a88fa-51e7-43c8-b640-fdf34d1d2957\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.422777 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f07a88fa-51e7-43c8-b640-fdf34d1d2957-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bncdj\" (UID: 
\"f07a88fa-51e7-43c8-b640-fdf34d1d2957\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.523730 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kf7f\" (UniqueName: \"kubernetes.io/projected/f07a88fa-51e7-43c8-b640-fdf34d1d2957-kube-api-access-4kf7f\") pod \"cert-manager-cainjector-5545bd876-bncdj\" (UID: \"f07a88fa-51e7-43c8-b640-fdf34d1d2957\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.523788 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f07a88fa-51e7-43c8-b640-fdf34d1d2957-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bncdj\" (UID: \"f07a88fa-51e7-43c8-b640-fdf34d1d2957\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.545672 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kf7f\" (UniqueName: \"kubernetes.io/projected/f07a88fa-51e7-43c8-b640-fdf34d1d2957-kube-api-access-4kf7f\") pod \"cert-manager-cainjector-5545bd876-bncdj\" (UID: \"f07a88fa-51e7-43c8-b640-fdf34d1d2957\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.549325 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f07a88fa-51e7-43c8-b640-fdf34d1d2957-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bncdj\" (UID: \"f07a88fa-51e7-43c8-b640-fdf34d1d2957\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" Mar 09 02:58:07 crc kubenswrapper[4901]: I0309 02:58:07.714089 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" Mar 09 02:58:08 crc kubenswrapper[4901]: W0309 02:58:08.235247 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf07a88fa_51e7_43c8_b640_fdf34d1d2957.slice/crio-92114d38682eed73a6b801242186cdfcdbfcfe63972297606783eb56da7b1993 WatchSource:0}: Error finding container 92114d38682eed73a6b801242186cdfcdbfcfe63972297606783eb56da7b1993: Status 404 returned error can't find the container with id 92114d38682eed73a6b801242186cdfcdbfcfe63972297606783eb56da7b1993 Mar 09 02:58:08 crc kubenswrapper[4901]: I0309 02:58:08.256824 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bncdj"] Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.248966 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" event={"ID":"f07a88fa-51e7-43c8-b640-fdf34d1d2957","Type":"ContainerStarted","Data":"7e6c8ccd3d38553909b0b6018ad154aa01fb5bd80aa711ee1b82f25db261d395"} Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.249384 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" event={"ID":"f07a88fa-51e7-43c8-b640-fdf34d1d2957","Type":"ContainerStarted","Data":"92114d38682eed73a6b801242186cdfcdbfcfe63972297606783eb56da7b1993"} Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.288704 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-bncdj" podStartSLOduration=2.288679877 podStartE2EDuration="2.288679877s" podCreationTimestamp="2026-03-09 02:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:58:09.2836208 +0000 UTC m=+1013.873284542" watchObservedRunningTime="2026-03-09 
02:58:09.288679877 +0000 UTC m=+1013.878343599" Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.699164 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-pvmjq"] Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.700077 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-pvmjq" Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.702017 4901 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-rwwcv" Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.708548 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-pvmjq"] Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.856672 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpfnb\" (UniqueName: \"kubernetes.io/projected/cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15-kube-api-access-dpfnb\") pod \"cert-manager-545d4d4674-pvmjq\" (UID: \"cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15\") " pod="cert-manager/cert-manager-545d4d4674-pvmjq" Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.857010 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15-bound-sa-token\") pod \"cert-manager-545d4d4674-pvmjq\" (UID: \"cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15\") " pod="cert-manager/cert-manager-545d4d4674-pvmjq" Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.958042 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpfnb\" (UniqueName: \"kubernetes.io/projected/cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15-kube-api-access-dpfnb\") pod \"cert-manager-545d4d4674-pvmjq\" (UID: \"cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15\") " pod="cert-manager/cert-manager-545d4d4674-pvmjq" Mar 09 
02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.958152 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15-bound-sa-token\") pod \"cert-manager-545d4d4674-pvmjq\" (UID: \"cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15\") " pod="cert-manager/cert-manager-545d4d4674-pvmjq" Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.978795 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15-bound-sa-token\") pod \"cert-manager-545d4d4674-pvmjq\" (UID: \"cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15\") " pod="cert-manager/cert-manager-545d4d4674-pvmjq" Mar 09 02:58:09 crc kubenswrapper[4901]: I0309 02:58:09.990479 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpfnb\" (UniqueName: \"kubernetes.io/projected/cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15-kube-api-access-dpfnb\") pod \"cert-manager-545d4d4674-pvmjq\" (UID: \"cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15\") " pod="cert-manager/cert-manager-545d4d4674-pvmjq" Mar 09 02:58:10 crc kubenswrapper[4901]: I0309 02:58:10.029915 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-pvmjq" Mar 09 02:58:10 crc kubenswrapper[4901]: I0309 02:58:10.258929 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-pvmjq"] Mar 09 02:58:11 crc kubenswrapper[4901]: I0309 02:58:11.262306 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-pvmjq" event={"ID":"cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15","Type":"ContainerStarted","Data":"4559e3eae1e7dd082f495c7c0688d984cef7ef957a1bf9c82a3dc7b162d5f012"} Mar 09 02:58:11 crc kubenswrapper[4901]: I0309 02:58:11.262358 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-pvmjq" event={"ID":"cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15","Type":"ContainerStarted","Data":"1a0625f979524b6a55c33145909654dda977bfae0a44a943abda06e01cb7e89f"} Mar 09 02:58:11 crc kubenswrapper[4901]: I0309 02:58:11.283591 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-pvmjq" podStartSLOduration=2.283560039 podStartE2EDuration="2.283560039s" podCreationTimestamp="2026-03-09 02:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:58:11.280503122 +0000 UTC m=+1015.870166864" watchObservedRunningTime="2026-03-09 02:58:11.283560039 +0000 UTC m=+1015.873223821" Mar 09 02:58:15 crc kubenswrapper[4901]: I0309 02:58:15.850710 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-tskr9" Mar 09 02:58:17 crc kubenswrapper[4901]: I0309 02:58:17.502057 4901 scope.go:117] "RemoveContainer" containerID="117f9dca07a548933ba5e15f807d01f05a62a9c6954c1054cff3e197c4a386e7" Mar 09 02:58:22 crc kubenswrapper[4901]: I0309 02:58:22.362108 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-87jmb"] 
Mar 09 02:58:22 crc kubenswrapper[4901]: I0309 02:58:22.364111 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-87jmb" Mar 09 02:58:22 crc kubenswrapper[4901]: I0309 02:58:22.373169 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 09 02:58:22 crc kubenswrapper[4901]: I0309 02:58:22.373359 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-mbmd5" Mar 09 02:58:22 crc kubenswrapper[4901]: I0309 02:58:22.373291 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 09 02:58:22 crc kubenswrapper[4901]: I0309 02:58:22.387278 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-87jmb"] Mar 09 02:58:22 crc kubenswrapper[4901]: I0309 02:58:22.432089 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttrzg\" (UniqueName: \"kubernetes.io/projected/3cd16b8c-edb0-460a-988b-539bafafb5a0-kube-api-access-ttrzg\") pod \"openstack-operator-index-87jmb\" (UID: \"3cd16b8c-edb0-460a-988b-539bafafb5a0\") " pod="openstack-operators/openstack-operator-index-87jmb" Mar 09 02:58:22 crc kubenswrapper[4901]: I0309 02:58:22.534128 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttrzg\" (UniqueName: \"kubernetes.io/projected/3cd16b8c-edb0-460a-988b-539bafafb5a0-kube-api-access-ttrzg\") pod \"openstack-operator-index-87jmb\" (UID: \"3cd16b8c-edb0-460a-988b-539bafafb5a0\") " pod="openstack-operators/openstack-operator-index-87jmb" Mar 09 02:58:22 crc kubenswrapper[4901]: I0309 02:58:22.556637 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttrzg\" (UniqueName: 
\"kubernetes.io/projected/3cd16b8c-edb0-460a-988b-539bafafb5a0-kube-api-access-ttrzg\") pod \"openstack-operator-index-87jmb\" (UID: \"3cd16b8c-edb0-460a-988b-539bafafb5a0\") " pod="openstack-operators/openstack-operator-index-87jmb" Mar 09 02:58:22 crc kubenswrapper[4901]: I0309 02:58:22.729685 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-87jmb" Mar 09 02:58:23 crc kubenswrapper[4901]: I0309 02:58:23.004753 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-87jmb"] Mar 09 02:58:23 crc kubenswrapper[4901]: I0309 02:58:23.349986 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-87jmb" event={"ID":"3cd16b8c-edb0-460a-988b-539bafafb5a0","Type":"ContainerStarted","Data":"73d4ad0bed8f9cbcfe5e56d47ddaf8f44f405fd9fce9156ca74871dd94e605de"} Mar 09 02:58:24 crc kubenswrapper[4901]: I0309 02:58:24.361305 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-87jmb" event={"ID":"3cd16b8c-edb0-460a-988b-539bafafb5a0","Type":"ContainerStarted","Data":"fedbe57e99509f49bab3e063fcb2a7475aad727fe114a91407162f998d4ef302"} Mar 09 02:58:24 crc kubenswrapper[4901]: I0309 02:58:24.384755 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-87jmb" podStartSLOduration=1.476024266 podStartE2EDuration="2.38472936s" podCreationTimestamp="2026-03-09 02:58:22 +0000 UTC" firstStartedPulling="2026-03-09 02:58:23.009111239 +0000 UTC m=+1027.598774971" lastFinishedPulling="2026-03-09 02:58:23.917816293 +0000 UTC m=+1028.507480065" observedRunningTime="2026-03-09 02:58:24.381469578 +0000 UTC m=+1028.971133350" watchObservedRunningTime="2026-03-09 02:58:24.38472936 +0000 UTC m=+1028.974393122" Mar 09 02:58:32 crc kubenswrapper[4901]: I0309 02:58:32.730687 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/openstack-operator-index-87jmb" Mar 09 02:58:32 crc kubenswrapper[4901]: I0309 02:58:32.731298 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-87jmb" Mar 09 02:58:32 crc kubenswrapper[4901]: I0309 02:58:32.768873 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-87jmb" Mar 09 02:58:33 crc kubenswrapper[4901]: I0309 02:58:33.480334 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-87jmb" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.611311 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55"] Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.614747 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.619619 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ztsvq" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.627785 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55"] Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.730213 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:34 crc 
kubenswrapper[4901]: I0309 02:58:34.730347 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6mnr\" (UniqueName: \"kubernetes.io/projected/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-kube-api-access-w6mnr\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.730441 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.832486 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.832556 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6mnr\" (UniqueName: \"kubernetes.io/projected/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-kube-api-access-w6mnr\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.832599 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.833353 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.833440 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.867008 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6mnr\" (UniqueName: \"kubernetes.io/projected/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-kube-api-access-w6mnr\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:34 crc kubenswrapper[4901]: I0309 02:58:34.944957 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:35 crc kubenswrapper[4901]: I0309 02:58:35.243887 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55"] Mar 09 02:58:35 crc kubenswrapper[4901]: I0309 02:58:35.456987 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" event={"ID":"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa","Type":"ContainerStarted","Data":"e681c5ad7b3ebe82597f25ec7b227d5e66689c4becf97ac5a5e26fa1aaa3f275"} Mar 09 02:58:35 crc kubenswrapper[4901]: I0309 02:58:35.457352 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" event={"ID":"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa","Type":"ContainerStarted","Data":"564bea5daf2f2b240f264c49af363b8695e4ce77593c9ed5c30dcc025a4d38ca"} Mar 09 02:58:36 crc kubenswrapper[4901]: I0309 02:58:36.468606 4901 generic.go:334] "Generic (PLEG): container finished" podID="1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" containerID="e681c5ad7b3ebe82597f25ec7b227d5e66689c4becf97ac5a5e26fa1aaa3f275" exitCode=0 Mar 09 02:58:36 crc kubenswrapper[4901]: I0309 02:58:36.468666 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" event={"ID":"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa","Type":"ContainerDied","Data":"e681c5ad7b3ebe82597f25ec7b227d5e66689c4becf97ac5a5e26fa1aaa3f275"} Mar 09 02:58:37 crc kubenswrapper[4901]: I0309 02:58:37.476956 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" 
event={"ID":"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa","Type":"ContainerStarted","Data":"357a03ac714a4c3b1ebccf248b6e253a0c722e7dc88c9258658fd2c1f6bfb320"} Mar 09 02:58:38 crc kubenswrapper[4901]: I0309 02:58:38.488064 4901 generic.go:334] "Generic (PLEG): container finished" podID="1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" containerID="357a03ac714a4c3b1ebccf248b6e253a0c722e7dc88c9258658fd2c1f6bfb320" exitCode=0 Mar 09 02:58:38 crc kubenswrapper[4901]: I0309 02:58:38.488260 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" event={"ID":"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa","Type":"ContainerDied","Data":"357a03ac714a4c3b1ebccf248b6e253a0c722e7dc88c9258658fd2c1f6bfb320"} Mar 09 02:58:39 crc kubenswrapper[4901]: I0309 02:58:39.502967 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" event={"ID":"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa","Type":"ContainerStarted","Data":"6564c218116d9abf361f3a9ff941c5d8a75bef080fa9b6bb9b668317a9ae4fdc"} Mar 09 02:58:40 crc kubenswrapper[4901]: I0309 02:58:40.513425 4901 generic.go:334] "Generic (PLEG): container finished" podID="1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" containerID="6564c218116d9abf361f3a9ff941c5d8a75bef080fa9b6bb9b668317a9ae4fdc" exitCode=0 Mar 09 02:58:40 crc kubenswrapper[4901]: I0309 02:58:40.513524 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" event={"ID":"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa","Type":"ContainerDied","Data":"6564c218116d9abf361f3a9ff941c5d8a75bef080fa9b6bb9b668317a9ae4fdc"} Mar 09 02:58:41 crc kubenswrapper[4901]: I0309 02:58:41.848860 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:41 crc kubenswrapper[4901]: I0309 02:58:41.952974 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6mnr\" (UniqueName: \"kubernetes.io/projected/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-kube-api-access-w6mnr\") pod \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " Mar 09 02:58:41 crc kubenswrapper[4901]: I0309 02:58:41.953037 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-util\") pod \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " Mar 09 02:58:41 crc kubenswrapper[4901]: I0309 02:58:41.953137 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-bundle\") pod \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\" (UID: \"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa\") " Mar 09 02:58:41 crc kubenswrapper[4901]: I0309 02:58:41.956615 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-bundle" (OuterVolumeSpecName: "bundle") pod "1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" (UID: "1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:58:41 crc kubenswrapper[4901]: I0309 02:58:41.968191 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-util" (OuterVolumeSpecName: "util") pod "1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" (UID: "1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 02:58:41 crc kubenswrapper[4901]: I0309 02:58:41.972486 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-kube-api-access-w6mnr" (OuterVolumeSpecName: "kube-api-access-w6mnr") pod "1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" (UID: "1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa"). InnerVolumeSpecName "kube-api-access-w6mnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 02:58:42 crc kubenswrapper[4901]: I0309 02:58:42.054391 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6mnr\" (UniqueName: \"kubernetes.io/projected/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-kube-api-access-w6mnr\") on node \"crc\" DevicePath \"\"" Mar 09 02:58:42 crc kubenswrapper[4901]: I0309 02:58:42.054421 4901 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-util\") on node \"crc\" DevicePath \"\"" Mar 09 02:58:42 crc kubenswrapper[4901]: I0309 02:58:42.054430 4901 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 02:58:42 crc kubenswrapper[4901]: I0309 02:58:42.534620 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" event={"ID":"1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa","Type":"ContainerDied","Data":"564bea5daf2f2b240f264c49af363b8695e4ce77593c9ed5c30dcc025a4d38ca"} Mar 09 02:58:42 crc kubenswrapper[4901]: I0309 02:58:42.534701 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="564bea5daf2f2b240f264c49af363b8695e4ce77593c9ed5c30dcc025a4d38ca" Mar 09 02:58:42 crc kubenswrapper[4901]: I0309 02:58:42.534725 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.584914 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg"] Mar 09 02:58:44 crc kubenswrapper[4901]: E0309 02:58:44.585515 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" containerName="extract" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.585531 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" containerName="extract" Mar 09 02:58:44 crc kubenswrapper[4901]: E0309 02:58:44.585544 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" containerName="pull" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.585552 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" containerName="pull" Mar 09 02:58:44 crc kubenswrapper[4901]: E0309 02:58:44.585562 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" containerName="util" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.585571 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" containerName="util" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.585722 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa" containerName="extract" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.586301 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.588846 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-5l6jq" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.626329 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg"] Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.688184 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsr2h\" (UniqueName: \"kubernetes.io/projected/e881f11e-7b7b-4a3e-9c63-fbd2d8cd61dd-kube-api-access-tsr2h\") pod \"openstack-operator-controller-init-6f44f7b99f-btdwg\" (UID: \"e881f11e-7b7b-4a3e-9c63-fbd2d8cd61dd\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.789677 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsr2h\" (UniqueName: \"kubernetes.io/projected/e881f11e-7b7b-4a3e-9c63-fbd2d8cd61dd-kube-api-access-tsr2h\") pod \"openstack-operator-controller-init-6f44f7b99f-btdwg\" (UID: \"e881f11e-7b7b-4a3e-9c63-fbd2d8cd61dd\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.809323 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsr2h\" (UniqueName: \"kubernetes.io/projected/e881f11e-7b7b-4a3e-9c63-fbd2d8cd61dd-kube-api-access-tsr2h\") pod \"openstack-operator-controller-init-6f44f7b99f-btdwg\" (UID: \"e881f11e-7b7b-4a3e-9c63-fbd2d8cd61dd\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg" Mar 09 02:58:44 crc kubenswrapper[4901]: I0309 02:58:44.906753 4901 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg" Mar 09 02:58:45 crc kubenswrapper[4901]: I0309 02:58:45.360523 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg"] Mar 09 02:58:45 crc kubenswrapper[4901]: W0309 02:58:45.369725 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode881f11e_7b7b_4a3e_9c63_fbd2d8cd61dd.slice/crio-23674d9450182a85296451265d219506a48f4154eff9c8a293a7bd92621ba839 WatchSource:0}: Error finding container 23674d9450182a85296451265d219506a48f4154eff9c8a293a7bd92621ba839: Status 404 returned error can't find the container with id 23674d9450182a85296451265d219506a48f4154eff9c8a293a7bd92621ba839 Mar 09 02:58:45 crc kubenswrapper[4901]: I0309 02:58:45.556657 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg" event={"ID":"e881f11e-7b7b-4a3e-9c63-fbd2d8cd61dd","Type":"ContainerStarted","Data":"23674d9450182a85296451265d219506a48f4154eff9c8a293a7bd92621ba839"} Mar 09 02:58:50 crc kubenswrapper[4901]: I0309 02:58:50.590339 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg" event={"ID":"e881f11e-7b7b-4a3e-9c63-fbd2d8cd61dd","Type":"ContainerStarted","Data":"21402fa016a144f2feb92267f5237295d820b8805a4ba52ecee7350e964a3121"} Mar 09 02:58:50 crc kubenswrapper[4901]: I0309 02:58:50.592344 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg" Mar 09 02:58:50 crc kubenswrapper[4901]: I0309 02:58:50.644266 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg" podStartSLOduration=2.218990536 
podStartE2EDuration="6.644247855s" podCreationTimestamp="2026-03-09 02:58:44 +0000 UTC" firstStartedPulling="2026-03-09 02:58:45.373715896 +0000 UTC m=+1049.963379628" lastFinishedPulling="2026-03-09 02:58:49.798973215 +0000 UTC m=+1054.388636947" observedRunningTime="2026-03-09 02:58:50.639365682 +0000 UTC m=+1055.229029434" watchObservedRunningTime="2026-03-09 02:58:50.644247855 +0000 UTC m=+1055.233911587" Mar 09 02:58:54 crc kubenswrapper[4901]: I0309 02:58:54.911090 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-btdwg" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.853559 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj"] Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.854757 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.857066 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-bkktb" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.859165 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc"] Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.859798 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.861685 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-fwqzp" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.867514 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj"] Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.875418 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc"] Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.913595 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz"] Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.914456 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.919967 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-w7fmg" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.943912 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf"] Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.944612 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.951445 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-kpgvk" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.973513 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz"] Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.983325 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p"] Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.984318 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.985937 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w87jw\" (UniqueName: \"kubernetes.io/projected/e690f8b6-f8ab-42fb-8e9c-be5dcbc52de4-kube-api-access-w87jw\") pod \"designate-operator-controller-manager-5d87c9d997-bvjjz\" (UID: \"e690f8b6-f8ab-42fb-8e9c-be5dcbc52de4\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.985990 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlkrl\" (UniqueName: \"kubernetes.io/projected/150f46cd-b329-412c-b0a9-acd69b79a434-kube-api-access-hlkrl\") pod \"barbican-operator-controller-manager-6db6876945-l7qrj\" (UID: \"150f46cd-b329-412c-b0a9-acd69b79a434\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.986040 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckn8\" (UniqueName: \"kubernetes.io/projected/e7f9248a-6f54-4cbd-9225-a601e2dd4e93-kube-api-access-gckn8\") pod \"cinder-operator-controller-manager-55d77d7b5c-7sctc\" (UID: \"e7f9248a-6f54-4cbd-9225-a601e2dd4e93\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc" Mar 09 02:59:13 crc kubenswrapper[4901]: I0309 02:59:13.986538 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-zb5jt" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:13.999296 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.004178 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.004912 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.012694 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-l8xw2" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.024646 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.031982 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.032698 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.038167 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bndbd" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.038418 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.062132 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.070321 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.086325 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.086803 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-244dc\" (UniqueName: \"kubernetes.io/projected/b99c3916-afa1-4f6c-a25e-ca7a7a30d5c6-kube-api-access-244dc\") pod \"horizon-operator-controller-manager-78bc7f9bd9-lfc7j\" (UID: \"b99c3916-afa1-4f6c-a25e-ca7a7a30d5c6\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.086868 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gckn8\" (UniqueName: \"kubernetes.io/projected/e7f9248a-6f54-4cbd-9225-a601e2dd4e93-kube-api-access-gckn8\") pod \"cinder-operator-controller-manager-55d77d7b5c-7sctc\" (UID: \"e7f9248a-6f54-4cbd-9225-a601e2dd4e93\") " 
pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.086894 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktvb\" (UniqueName: \"kubernetes.io/projected/21d847c9-9877-4e8d-b414-7f8035ebfc32-kube-api-access-dktvb\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.086934 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w87jw\" (UniqueName: \"kubernetes.io/projected/e690f8b6-f8ab-42fb-8e9c-be5dcbc52de4-kube-api-access-w87jw\") pod \"designate-operator-controller-manager-5d87c9d997-bvjjz\" (UID: \"e690f8b6-f8ab-42fb-8e9c-be5dcbc52de4\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.086958 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.086981 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlkrl\" (UniqueName: \"kubernetes.io/projected/150f46cd-b329-412c-b0a9-acd69b79a434-kube-api-access-hlkrl\") pod \"barbican-operator-controller-manager-6db6876945-l7qrj\" (UID: \"150f46cd-b329-412c-b0a9-acd69b79a434\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.087008 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vrdz\" (UniqueName: \"kubernetes.io/projected/5c11cb5a-ab01-422d-a70b-33bb9dd06f8b-kube-api-access-9vrdz\") pod \"heat-operator-controller-manager-cf99c678f-xrb6p\" (UID: \"5c11cb5a-ab01-422d-a70b-33bb9dd06f8b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.087028 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrwrt\" (UniqueName: \"kubernetes.io/projected/68bf831f-f763-4c43-b57f-13d244b3a21e-kube-api-access-mrwrt\") pod \"glance-operator-controller-manager-64db6967f8-gclrf\" (UID: \"68bf831f-f763-4c43-b57f-13d244b3a21e\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.087294 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.093005 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rhjf4" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.104599 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.124310 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlkrl\" (UniqueName: \"kubernetes.io/projected/150f46cd-b329-412c-b0a9-acd69b79a434-kube-api-access-hlkrl\") pod \"barbican-operator-controller-manager-6db6876945-l7qrj\" (UID: \"150f46cd-b329-412c-b0a9-acd69b79a434\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj" Mar 09 02:59:14 crc kubenswrapper[4901]: 
I0309 02:59:14.128450 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w87jw\" (UniqueName: \"kubernetes.io/projected/e690f8b6-f8ab-42fb-8e9c-be5dcbc52de4-kube-api-access-w87jw\") pod \"designate-operator-controller-manager-5d87c9d997-bvjjz\" (UID: \"e690f8b6-f8ab-42fb-8e9c-be5dcbc52de4\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.148194 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.149116 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckn8\" (UniqueName: \"kubernetes.io/projected/e7f9248a-6f54-4cbd-9225-a601e2dd4e93-kube-api-access-gckn8\") pod \"cinder-operator-controller-manager-55d77d7b5c-7sctc\" (UID: \"e7f9248a-6f54-4cbd-9225-a601e2dd4e93\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.175170 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.176741 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.176967 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-n5px9"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.179740 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-w52qb" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.186157 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.189565 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.189622 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vrdz\" (UniqueName: \"kubernetes.io/projected/5c11cb5a-ab01-422d-a70b-33bb9dd06f8b-kube-api-access-9vrdz\") pod \"heat-operator-controller-manager-cf99c678f-xrb6p\" (UID: \"5c11cb5a-ab01-422d-a70b-33bb9dd06f8b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.189685 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrwrt\" (UniqueName: \"kubernetes.io/projected/68bf831f-f763-4c43-b57f-13d244b3a21e-kube-api-access-mrwrt\") pod \"glance-operator-controller-manager-64db6967f8-gclrf\" (UID: \"68bf831f-f763-4c43-b57f-13d244b3a21e\") " 
pod="openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf" Mar 09 02:59:14 crc kubenswrapper[4901]: E0309 02:59:14.189697 4901 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.189715 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d774\" (UniqueName: \"kubernetes.io/projected/3644aa78-798b-40f2-9041-700fc89959e0-kube-api-access-5d774\") pod \"ironic-operator-controller-manager-545456dc4-5qmp2\" (UID: \"3644aa78-798b-40f2-9041-700fc89959e0\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2" Mar 09 02:59:14 crc kubenswrapper[4901]: E0309 02:59:14.189752 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert podName:21d847c9-9877-4e8d-b414-7f8035ebfc32 nodeName:}" failed. No retries permitted until 2026-03-09 02:59:14.689735293 +0000 UTC m=+1079.279399025 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert") pod "infra-operator-controller-manager-f7fcc58b9-24mvj" (UID: "21d847c9-9877-4e8d-b414-7f8035ebfc32") : secret "infra-operator-webhook-server-cert" not found Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.189768 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-244dc\" (UniqueName: \"kubernetes.io/projected/b99c3916-afa1-4f6c-a25e-ca7a7a30d5c6-kube-api-access-244dc\") pod \"horizon-operator-controller-manager-78bc7f9bd9-lfc7j\" (UID: \"b99c3916-afa1-4f6c-a25e-ca7a7a30d5c6\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.189808 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dktvb\" (UniqueName: \"kubernetes.io/projected/21d847c9-9877-4e8d-b414-7f8035ebfc32-kube-api-access-dktvb\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.198908 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.198993 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-n5px9" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.201200 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-h7b62" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.206505 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-n5px9"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.210085 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-244dc\" (UniqueName: \"kubernetes.io/projected/b99c3916-afa1-4f6c-a25e-ca7a7a30d5c6-kube-api-access-244dc\") pod \"horizon-operator-controller-manager-78bc7f9bd9-lfc7j\" (UID: \"b99c3916-afa1-4f6c-a25e-ca7a7a30d5c6\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.213752 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrwrt\" (UniqueName: \"kubernetes.io/projected/68bf831f-f763-4c43-b57f-13d244b3a21e-kube-api-access-mrwrt\") pod \"glance-operator-controller-manager-64db6967f8-gclrf\" (UID: \"68bf831f-f763-4c43-b57f-13d244b3a21e\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.219696 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vrdz\" (UniqueName: \"kubernetes.io/projected/5c11cb5a-ab01-422d-a70b-33bb9dd06f8b-kube-api-access-9vrdz\") pod \"heat-operator-controller-manager-cf99c678f-xrb6p\" (UID: \"5c11cb5a-ab01-422d-a70b-33bb9dd06f8b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.220534 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dktvb\" (UniqueName: \"kubernetes.io/projected/21d847c9-9877-4e8d-b414-7f8035ebfc32-kube-api-access-dktvb\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.228873 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.229770 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.232556 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ffvx2" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.237371 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.238426 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.240603 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-fhrwh" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.245001 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.247557 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.258287 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.264899 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.268140 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.269633 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.274279 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-q5wl8" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.275347 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.284823 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.285773 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.288144 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-npnzr" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.290059 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.291612 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kbvd\" (UniqueName: \"kubernetes.io/projected/a6308427-401a-4c01-afdb-e385f8efc20d-kube-api-access-4kbvd\") pod \"neutron-operator-controller-manager-54688575f-bvb9v\" (UID: \"a6308427-401a-4c01-afdb-e385f8efc20d\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.291655 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb5jr\" (UniqueName: \"kubernetes.io/projected/49531808-52f2-497a-98bc-61883926e221-kube-api-access-mb5jr\") pod \"mariadb-operator-controller-manager-7b6bfb6475-h2bk7\" (UID: \"49531808-52f2-497a-98bc-61883926e221\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.291704 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mvpw\" (UniqueName: \"kubernetes.io/projected/6ee7156f-f994-4e79-875d-744fed479fcf-kube-api-access-5mvpw\") pod \"keystone-operator-controller-manager-7c789f89c6-tkpsk\" (UID: \"6ee7156f-f994-4e79-875d-744fed479fcf\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 
02:59:14.291723 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d774\" (UniqueName: \"kubernetes.io/projected/3644aa78-798b-40f2-9041-700fc89959e0-kube-api-access-5d774\") pod \"ironic-operator-controller-manager-545456dc4-5qmp2\" (UID: \"3644aa78-798b-40f2-9041-700fc89959e0\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.291742 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8h4x\" (UniqueName: \"kubernetes.io/projected/1b18c23b-f21d-4935-898a-2864b473119c-kube-api-access-w8h4x\") pod \"manila-operator-controller-manager-67d996989d-n5px9\" (UID: \"1b18c23b-f21d-4935-898a-2864b473119c\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-n5px9" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.301879 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.304461 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.308485 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.308496 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-76rl6" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.313291 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.314918 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.317049 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d774\" (UniqueName: \"kubernetes.io/projected/3644aa78-798b-40f2-9041-700fc89959e0-kube-api-access-5d774\") pod \"ironic-operator-controller-manager-545456dc4-5qmp2\" (UID: \"3644aa78-798b-40f2-9041-700fc89959e0\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.318448 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.321959 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.322521 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-w7qnh" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.334577 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.336718 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.366703 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.368952 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.374893 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-j76xn" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.376958 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.377790 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.379086 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-d6xkf" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.382755 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.393420 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmnw9\" (UniqueName: \"kubernetes.io/projected/00255202-1625-4595-a5d2-90aadb87fcfc-kube-api-access-tmnw9\") pod \"octavia-operator-controller-manager-5d86c7ddb7-mksnm\" (UID: \"00255202-1625-4595-a5d2-90aadb87fcfc\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.393466 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbx5n\" (UniqueName: \"kubernetes.io/projected/15663ad3-38d8-4a71-88fd-28f74b590e6e-kube-api-access-zbx5n\") pod \"ovn-operator-controller-manager-75684d597f-x98gb\" (UID: \"15663ad3-38d8-4a71-88fd-28f74b590e6e\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.393518 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kbvd\" (UniqueName: \"kubernetes.io/projected/a6308427-401a-4c01-afdb-e385f8efc20d-kube-api-access-4kbvd\") pod \"neutron-operator-controller-manager-54688575f-bvb9v\" (UID: \"a6308427-401a-4c01-afdb-e385f8efc20d\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.393541 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x9p6\" (UniqueName: \"kubernetes.io/projected/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-kube-api-access-4x9p6\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.393563 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2tbx\" (UniqueName: \"kubernetes.io/projected/e6489b4e-ac8f-4853-90a2-6f7ec0af3367-kube-api-access-z2tbx\") pod \"nova-operator-controller-manager-74b6b5dc96-6wntt\" (UID: \"e6489b4e-ac8f-4853-90a2-6f7ec0af3367\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.393587 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb5jr\" (UniqueName: \"kubernetes.io/projected/49531808-52f2-497a-98bc-61883926e221-kube-api-access-mb5jr\") pod \"mariadb-operator-controller-manager-7b6bfb6475-h2bk7\" (UID: \"49531808-52f2-497a-98bc-61883926e221\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.393631 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.393653 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mvpw\" 
(UniqueName: \"kubernetes.io/projected/6ee7156f-f994-4e79-875d-744fed479fcf-kube-api-access-5mvpw\") pod \"keystone-operator-controller-manager-7c789f89c6-tkpsk\" (UID: \"6ee7156f-f994-4e79-875d-744fed479fcf\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.393671 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8h4x\" (UniqueName: \"kubernetes.io/projected/1b18c23b-f21d-4935-898a-2864b473119c-kube-api-access-w8h4x\") pod \"manila-operator-controller-manager-67d996989d-n5px9\" (UID: \"1b18c23b-f21d-4935-898a-2864b473119c\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-n5px9" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.407202 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.407278 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.408052 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.412384 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8h4x\" (UniqueName: \"kubernetes.io/projected/1b18c23b-f21d-4935-898a-2864b473119c-kube-api-access-w8h4x\") pod \"manila-operator-controller-manager-67d996989d-n5px9\" (UID: \"1b18c23b-f21d-4935-898a-2864b473119c\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-n5px9" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.415510 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mvpw\" (UniqueName: \"kubernetes.io/projected/6ee7156f-f994-4e79-875d-744fed479fcf-kube-api-access-5mvpw\") pod \"keystone-operator-controller-manager-7c789f89c6-tkpsk\" (UID: \"6ee7156f-f994-4e79-875d-744fed479fcf\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.427403 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.427897 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-zggm9" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.445810 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb5jr\" (UniqueName: \"kubernetes.io/projected/49531808-52f2-497a-98bc-61883926e221-kube-api-access-mb5jr\") pod \"mariadb-operator-controller-manager-7b6bfb6475-h2bk7\" (UID: \"49531808-52f2-497a-98bc-61883926e221\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.446474 4901 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-4kbvd\" (UniqueName: \"kubernetes.io/projected/a6308427-401a-4c01-afdb-e385f8efc20d-kube-api-access-4kbvd\") pod \"neutron-operator-controller-manager-54688575f-bvb9v\" (UID: \"a6308427-401a-4c01-afdb-e385f8efc20d\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.487109 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.495752 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.495802 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgfv4\" (UniqueName: \"kubernetes.io/projected/4b68ef50-4ff4-420c-a455-ec9dd86db4cc-kube-api-access-zgfv4\") pod \"swift-operator-controller-manager-9b9ff9f4d-2qbw7\" (UID: \"4b68ef50-4ff4-420c-a455-ec9dd86db4cc\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" Mar 09 02:59:14 crc kubenswrapper[4901]: E0309 02:59:14.497026 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:14 crc kubenswrapper[4901]: E0309 02:59:14.497086 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert podName:dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974 nodeName:}" failed. 
No retries permitted until 2026-03-09 02:59:14.997066667 +0000 UTC m=+1079.586730399 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" (UID: "dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.495827 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmnw9\" (UniqueName: \"kubernetes.io/projected/00255202-1625-4595-a5d2-90aadb87fcfc-kube-api-access-tmnw9\") pod \"octavia-operator-controller-manager-5d86c7ddb7-mksnm\" (UID: \"00255202-1625-4595-a5d2-90aadb87fcfc\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.498397 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbx5n\" (UniqueName: \"kubernetes.io/projected/15663ad3-38d8-4a71-88fd-28f74b590e6e-kube-api-access-zbx5n\") pod \"ovn-operator-controller-manager-75684d597f-x98gb\" (UID: \"15663ad3-38d8-4a71-88fd-28f74b590e6e\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.498426 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8fmk\" (UniqueName: \"kubernetes.io/projected/37afdbe5-071a-4161-8390-1de33fefd993-kube-api-access-l8fmk\") pod \"telemetry-operator-controller-manager-5fdb694969-8d9wj\" (UID: \"37afdbe5-071a-4161-8390-1de33fefd993\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.498509 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x9p6\" 
(UniqueName: \"kubernetes.io/projected/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-kube-api-access-4x9p6\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.498536 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2tbx\" (UniqueName: \"kubernetes.io/projected/e6489b4e-ac8f-4853-90a2-6f7ec0af3367-kube-api-access-z2tbx\") pod \"nova-operator-controller-manager-74b6b5dc96-6wntt\" (UID: \"e6489b4e-ac8f-4853-90a2-6f7ec0af3367\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.498563 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjxc2\" (UniqueName: \"kubernetes.io/projected/17fc5096-7a83-4918-ba7f-213f188a1ce3-kube-api-access-fjxc2\") pod \"placement-operator-controller-manager-648564c9fc-5z5dd\" (UID: \"17fc5096-7a83-4918-ba7f-213f188a1ce3\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.524975 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbx5n\" (UniqueName: \"kubernetes.io/projected/15663ad3-38d8-4a71-88fd-28f74b590e6e-kube-api-access-zbx5n\") pod \"ovn-operator-controller-manager-75684d597f-x98gb\" (UID: \"15663ad3-38d8-4a71-88fd-28f74b590e6e\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.529519 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2tbx\" (UniqueName: \"kubernetes.io/projected/e6489b4e-ac8f-4853-90a2-6f7ec0af3367-kube-api-access-z2tbx\") pod 
\"nova-operator-controller-manager-74b6b5dc96-6wntt\" (UID: \"e6489b4e-ac8f-4853-90a2-6f7ec0af3367\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.531061 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x9p6\" (UniqueName: \"kubernetes.io/projected/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-kube-api-access-4x9p6\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.535779 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.539751 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmnw9\" (UniqueName: \"kubernetes.io/projected/00255202-1625-4595-a5d2-90aadb87fcfc-kube-api-access-tmnw9\") pod \"octavia-operator-controller-manager-5d86c7ddb7-mksnm\" (UID: \"00255202-1625-4595-a5d2-90aadb87fcfc\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.542717 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.543491 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.547129 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-gzn59" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.572339 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.605362 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgfv4\" (UniqueName: \"kubernetes.io/projected/4b68ef50-4ff4-420c-a455-ec9dd86db4cc-kube-api-access-zgfv4\") pod \"swift-operator-controller-manager-9b9ff9f4d-2qbw7\" (UID: \"4b68ef50-4ff4-420c-a455-ec9dd86db4cc\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.605429 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8fmk\" (UniqueName: \"kubernetes.io/projected/37afdbe5-071a-4161-8390-1de33fefd993-kube-api-access-l8fmk\") pod \"telemetry-operator-controller-manager-5fdb694969-8d9wj\" (UID: \"37afdbe5-071a-4161-8390-1de33fefd993\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.605500 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjxc2\" (UniqueName: \"kubernetes.io/projected/17fc5096-7a83-4918-ba7f-213f188a1ce3-kube-api-access-fjxc2\") pod \"placement-operator-controller-manager-648564c9fc-5z5dd\" (UID: \"17fc5096-7a83-4918-ba7f-213f188a1ce3\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.605546 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jxdv\" (UniqueName: \"kubernetes.io/projected/40444baa-6195-4d1a-9704-21874564d865-kube-api-access-5jxdv\") pod \"test-operator-controller-manager-55b5ff4dbb-6qs6f\" (UID: \"40444baa-6195-4d1a-9704-21874564d865\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.607286 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.619061 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-n5px9" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.636378 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8fmk\" (UniqueName: \"kubernetes.io/projected/37afdbe5-071a-4161-8390-1de33fefd993-kube-api-access-l8fmk\") pod \"telemetry-operator-controller-manager-5fdb694969-8d9wj\" (UID: \"37afdbe5-071a-4161-8390-1de33fefd993\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.636734 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgfv4\" (UniqueName: \"kubernetes.io/projected/4b68ef50-4ff4-420c-a455-ec9dd86db4cc-kube-api-access-zgfv4\") pod \"swift-operator-controller-manager-9b9ff9f4d-2qbw7\" (UID: \"4b68ef50-4ff4-420c-a455-ec9dd86db4cc\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.640935 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjxc2\" (UniqueName: \"kubernetes.io/projected/17fc5096-7a83-4918-ba7f-213f188a1ce3-kube-api-access-fjxc2\") pod 
\"placement-operator-controller-manager-648564c9fc-5z5dd\" (UID: \"17fc5096-7a83-4918-ba7f-213f188a1ce3\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.655633 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.659463 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.667087 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.677854 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.678767 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.681637 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-zwr5t" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.700449 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.705043 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.705839 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.707071 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.707102 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jxdv\" (UniqueName: \"kubernetes.io/projected/40444baa-6195-4d1a-9704-21874564d865-kube-api-access-5jxdv\") pod \"test-operator-controller-manager-55b5ff4dbb-6qs6f\" (UID: \"40444baa-6195-4d1a-9704-21874564d865\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" Mar 09 02:59:14 crc kubenswrapper[4901]: E0309 02:59:14.707809 4901 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 02:59:14 crc kubenswrapper[4901]: E0309 02:59:14.707859 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert podName:21d847c9-9877-4e8d-b414-7f8035ebfc32 nodeName:}" failed. No retries permitted until 2026-03-09 02:59:15.707844118 +0000 UTC m=+1080.297507850 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert") pod "infra-operator-controller-manager-f7fcc58b9-24mvj" (UID: "21d847c9-9877-4e8d-b414-7f8035ebfc32") : secret "infra-operator-webhook-server-cert" not found Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.709652 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.709804 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.712387 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-s5d64" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.724492 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.725594 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jxdv\" (UniqueName: \"kubernetes.io/projected/40444baa-6195-4d1a-9704-21874564d865-kube-api-access-5jxdv\") pod \"test-operator-controller-manager-55b5ff4dbb-6qs6f\" (UID: \"40444baa-6195-4d1a-9704-21874564d865\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.728341 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.731866 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.733271 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.738651 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-x87cc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.749432 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.805164 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.807963 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2tc\" (UniqueName: \"kubernetes.io/projected/38e4f81e-db3d-4d6d-824c-cdcf6e42ab1f-kube-api-access-lt2tc\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wvsdc\" (UID: \"38e4f81e-db3d-4d6d-824c-cdcf6e42ab1f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.808079 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.808374 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcf7h\" (UniqueName: \"kubernetes.io/projected/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-kube-api-access-kcf7h\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.808423 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.808539 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq6t4\" (UniqueName: \"kubernetes.io/projected/f86fd171-86dc-4bad-8b53-24cde4942e76-kube-api-access-lq6t4\") pod \"watcher-operator-controller-manager-bccc79885-r2rzc\" (UID: \"f86fd171-86dc-4bad-8b53-24cde4942e76\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.818655 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj"] Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.872116 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.899860 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.913365 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.913523 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf7h\" (UniqueName: \"kubernetes.io/projected/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-kube-api-access-kcf7h\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.913564 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.913604 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq6t4\" (UniqueName: \"kubernetes.io/projected/f86fd171-86dc-4bad-8b53-24cde4942e76-kube-api-access-lq6t4\") pod \"watcher-operator-controller-manager-bccc79885-r2rzc\" (UID: \"f86fd171-86dc-4bad-8b53-24cde4942e76\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.913668 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2tc\" (UniqueName: \"kubernetes.io/projected/38e4f81e-db3d-4d6d-824c-cdcf6e42ab1f-kube-api-access-lt2tc\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wvsdc\" (UID: \"38e4f81e-db3d-4d6d-824c-cdcf6e42ab1f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc" Mar 09 02:59:14 crc kubenswrapper[4901]: E0309 02:59:14.914098 4901 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 02:59:14 crc kubenswrapper[4901]: E0309 02:59:14.914141 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:15.414125237 +0000 UTC m=+1080.003788969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "webhook-server-cert" not found Mar 09 02:59:14 crc kubenswrapper[4901]: E0309 02:59:14.914315 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 02:59:14 crc kubenswrapper[4901]: E0309 02:59:14.914341 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:15.414332702 +0000 UTC m=+1080.003996434 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "metrics-server-cert" not found Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.927671 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.934128 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcf7h\" (UniqueName: \"kubernetes.io/projected/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-kube-api-access-kcf7h\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.934820 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2tc\" (UniqueName: \"kubernetes.io/projected/38e4f81e-db3d-4d6d-824c-cdcf6e42ab1f-kube-api-access-lt2tc\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wvsdc\" (UID: \"38e4f81e-db3d-4d6d-824c-cdcf6e42ab1f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.935395 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq6t4\" (UniqueName: \"kubernetes.io/projected/f86fd171-86dc-4bad-8b53-24cde4942e76-kube-api-access-lq6t4\") pod \"watcher-operator-controller-manager-bccc79885-r2rzc\" (UID: \"f86fd171-86dc-4bad-8b53-24cde4942e76\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.954267 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" Mar 09 02:59:14 crc kubenswrapper[4901]: I0309 02:59:14.974258 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.010095 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.015587 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.015744 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.015975 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert podName:dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974 nodeName:}" failed. No retries permitted until 2026-03-09 02:59:16.015790849 +0000 UTC m=+1080.605454581 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" (UID: "dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:15 crc kubenswrapper[4901]: W0309 02:59:15.016845 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68bf831f_f763_4c43_b57f_13d244b3a21e.slice/crio-a00ac6568842644208d1ae35a933c4b1ebefc1ff6b7738d6324340b972c77837 WatchSource:0}: Error finding container a00ac6568842644208d1ae35a933c4b1ebefc1ff6b7738d6324340b972c77837: Status 404 returned error can't find the container with id a00ac6568842644208d1ae35a933c4b1ebefc1ff6b7738d6324340b972c77837 Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.069826 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.070509 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.087770 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j"] Mar 09 02:59:15 crc kubenswrapper[4901]: W0309 02:59:15.090997 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode690f8b6_f8ab_42fb_8e9c_be5dcbc52de4.slice/crio-21e4ddb336eeed548c4e23df639d7e87305e48eb85d35fc7ef820fd988e10bb6 WatchSource:0}: Error finding container 21e4ddb336eeed548c4e23df639d7e87305e48eb85d35fc7ef820fd988e10bb6: Status 404 returned error can't find the container with id 21e4ddb336eeed548c4e23df639d7e87305e48eb85d35fc7ef820fd988e10bb6 Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.093025 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p"] Mar 09 02:59:15 crc kubenswrapper[4901]: W0309 02:59:15.104036 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb99c3916_afa1_4f6c_a25e_ca7a7a30d5c6.slice/crio-a62ebe2d6fc159f955acb68742fa85d7a1c61cd4105b7e4d7db3576530a8d5e9 WatchSource:0}: Error finding container a62ebe2d6fc159f955acb68742fa85d7a1c61cd4105b7e4d7db3576530a8d5e9: Status 404 returned error can't find the container with id a62ebe2d6fc159f955acb68742fa85d7a1c61cd4105b7e4d7db3576530a8d5e9 Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.241437 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.246428 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb"] Mar 09 02:59:15 crc 
kubenswrapper[4901]: I0309 02:59:15.321764 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-n5px9"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.348561 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.353391 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.361937 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.368346 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm"] Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.372675 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tmnw9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-mksnm_openstack-operators(00255202-1625-4595-a5d2-90aadb87fcfc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.372781 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v"] Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.373853 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" podUID="00255202-1625-4595-a5d2-90aadb87fcfc" Mar 09 02:59:15 crc kubenswrapper[4901]: W0309 02:59:15.374559 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6308427_401a_4c01_afdb_e385f8efc20d.slice/crio-70c60def46964f5c214f6f718dbe3f5ef803f69627e3eb240b2739706ae31785 WatchSource:0}: Error finding container 70c60def46964f5c214f6f718dbe3f5ef803f69627e3eb240b2739706ae31785: Status 404 returned error can't find the container with id 70c60def46964f5c214f6f718dbe3f5ef803f69627e3eb240b2739706ae31785 Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.376786 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4kbvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54688575f-bvb9v_openstack-operators(a6308427-401a-4c01-afdb-e385f8efc20d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.378499 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" podUID="a6308427-401a-4c01-afdb-e385f8efc20d" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.423197 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " 
pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.423305 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.423330 4901 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.423381 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:16.423367141 +0000 UTC m=+1081.013030873 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "webhook-server-cert" not found Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.423433 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.423498 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:16.423481314 +0000 UTC m=+1081.013145046 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "metrics-server-cert" not found Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.529489 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.536193 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.539833 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.543732 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc"] Mar 09 02:59:15 crc kubenswrapper[4901]: W0309 02:59:15.546282 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17fc5096_7a83_4918_ba7f_213f188a1ce3.slice/crio-f75aba29660b11df7250bd55da69a3ed8d94abe4298bdc7883d13ade5ebe81b8 WatchSource:0}: Error finding container f75aba29660b11df7250bd55da69a3ed8d94abe4298bdc7883d13ade5ebe81b8: Status 404 returned error can't find the container with id f75aba29660b11df7250bd55da69a3ed8d94abe4298bdc7883d13ade5ebe81b8 Mar 09 02:59:15 crc kubenswrapper[4901]: W0309 02:59:15.550469 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b68ef50_4ff4_420c_a455_ec9dd86db4cc.slice/crio-477aa0f2e255f5aede690f8a435a0e516bc86b529f0ee97ef0c77c5118c3576a WatchSource:0}: Error finding container 
477aa0f2e255f5aede690f8a435a0e516bc86b529f0ee97ef0c77c5118c3576a: Status 404 returned error can't find the container with id 477aa0f2e255f5aede690f8a435a0e516bc86b529f0ee97ef0c77c5118c3576a Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.552831 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zgfv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9b9ff9f4d-2qbw7_openstack-operators(4b68ef50-4ff4-420c-a455-ec9dd86db4cc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.554057 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" podUID="4b68ef50-4ff4-420c-a455-ec9dd86db4cc" Mar 09 02:59:15 crc kubenswrapper[4901]: W0309 02:59:15.561089 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf86fd171_86dc_4bad_8b53_24cde4942e76.slice/crio-4bdb846519f748bbda22c5d08fb9da1d7d97006a516f58136fae8c388204d9cc WatchSource:0}: Error finding container 4bdb846519f748bbda22c5d08fb9da1d7d97006a516f58136fae8c388204d9cc: Status 404 returned error can't find the container with id 4bdb846519f748bbda22c5d08fb9da1d7d97006a516f58136fae8c388204d9cc Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.563528 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f"] Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.565822 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lq6t4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-r2rzc_openstack-operators(f86fd171-86dc-4bad-8b53-24cde4942e76): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.567052 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" podUID="f86fd171-86dc-4bad-8b53-24cde4942e76" Mar 09 02:59:15 crc kubenswrapper[4901]: W0309 02:59:15.567839 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37afdbe5_071a_4161_8390_1de33fefd993.slice/crio-a6441ef62261697c7328940a1dddcf71b4ff8b43682701d11a7b27b40ce4bc7c WatchSource:0}: Error finding container a6441ef62261697c7328940a1dddcf71b4ff8b43682701d11a7b27b40ce4bc7c: Status 404 returned error can't find the container with id a6441ef62261697c7328940a1dddcf71b4ff8b43682701d11a7b27b40ce4bc7c Mar 09 02:59:15 crc kubenswrapper[4901]: W0309 02:59:15.569773 4901 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40444baa_6195_4d1a_9704_21874564d865.slice/crio-4eb9345be1d5c3779406ed6c0f16be6da0322a76af506e5336cf2869cb906c5b WatchSource:0}: Error finding container 4eb9345be1d5c3779406ed6c0f16be6da0322a76af506e5336cf2869cb906c5b: Status 404 returned error can't find the container with id 4eb9345be1d5c3779406ed6c0f16be6da0322a76af506e5336cf2869cb906c5b Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.571116 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l8fmk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5fdb694969-8d9wj_openstack-operators(37afdbe5-071a-4161-8390-1de33fefd993): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.572265 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" podUID="37afdbe5-071a-4161-8390-1de33fefd993" Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.572405 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5jxdv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-6qs6f_openstack-operators(40444baa-6195-4d1a-9704-21874564d865): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.573619 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" podUID="40444baa-6195-4d1a-9704-21874564d865" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.668871 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc"] Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.726854 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.727080 4901 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.727185 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert podName:21d847c9-9877-4e8d-b414-7f8035ebfc32 nodeName:}" failed. No retries permitted until 2026-03-09 02:59:17.727163077 +0000 UTC m=+1082.316826809 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert") pod "infra-operator-controller-manager-f7fcc58b9-24mvj" (UID: "21d847c9-9877-4e8d-b414-7f8035ebfc32") : secret "infra-operator-webhook-server-cert" not found Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.820465 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj" event={"ID":"150f46cd-b329-412c-b0a9-acd69b79a434","Type":"ContainerStarted","Data":"3977491b1e9b134be6f2c0b3b6175360da9934ba06b65f20ebc617d6f41a191c"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.821799 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd" event={"ID":"17fc5096-7a83-4918-ba7f-213f188a1ce3","Type":"ContainerStarted","Data":"f75aba29660b11df7250bd55da69a3ed8d94abe4298bdc7883d13ade5ebe81b8"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.822912 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb" event={"ID":"15663ad3-38d8-4a71-88fd-28f74b590e6e","Type":"ContainerStarted","Data":"e4aed5bbe9832ae31e2ba86d275ec6d4cb60780fb106f1855a111c763cbb6847"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.825197 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt" event={"ID":"e6489b4e-ac8f-4853-90a2-6f7ec0af3367","Type":"ContainerStarted","Data":"a6a8585081d826c67610a5586601ecf61c9b41caff0d24dd8567ad118ebff773"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.828012 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" event={"ID":"6ee7156f-f994-4e79-875d-744fed479fcf","Type":"ContainerStarted","Data":"e9063931909d95c4243cec851aa85048443b242d97aba8fa5a50cba2629c4f9d"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.829743 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" event={"ID":"4b68ef50-4ff4-420c-a455-ec9dd86db4cc","Type":"ContainerStarted","Data":"477aa0f2e255f5aede690f8a435a0e516bc86b529f0ee97ef0c77c5118c3576a"} Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.831005 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" podUID="4b68ef50-4ff4-420c-a455-ec9dd86db4cc" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.831153 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p" event={"ID":"5c11cb5a-ab01-422d-a70b-33bb9dd06f8b","Type":"ContainerStarted","Data":"023307f13c4eadb5caf8365420985c6c239d98abb341298b81fd5b4e0fba888e"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.833790 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-n5px9" 
event={"ID":"1b18c23b-f21d-4935-898a-2864b473119c","Type":"ContainerStarted","Data":"3375e364c47ebf21321105cc8b79709e669a4126dbb930737a6b38c809a81853"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.837147 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" event={"ID":"a6308427-401a-4c01-afdb-e385f8efc20d","Type":"ContainerStarted","Data":"70c60def46964f5c214f6f718dbe3f5ef803f69627e3eb240b2739706ae31785"} Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.838487 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" podUID="a6308427-401a-4c01-afdb-e385f8efc20d" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.839112 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" event={"ID":"f86fd171-86dc-4bad-8b53-24cde4942e76","Type":"ContainerStarted","Data":"4bdb846519f748bbda22c5d08fb9da1d7d97006a516f58136fae8c388204d9cc"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.841922 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j" event={"ID":"b99c3916-afa1-4f6c-a25e-ca7a7a30d5c6","Type":"ContainerStarted","Data":"a62ebe2d6fc159f955acb68742fa85d7a1c61cd4105b7e4d7db3576530a8d5e9"} Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.849103 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" podUID="f86fd171-86dc-4bad-8b53-24cde4942e76" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.856581 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc" event={"ID":"e7f9248a-6f54-4cbd-9225-a601e2dd4e93","Type":"ContainerStarted","Data":"95778c659ae089493f51bf5a66601d01708820eeeaa8fda9017522c7d749159e"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.861035 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" event={"ID":"40444baa-6195-4d1a-9704-21874564d865","Type":"ContainerStarted","Data":"4eb9345be1d5c3779406ed6c0f16be6da0322a76af506e5336cf2869cb906c5b"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.864211 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" event={"ID":"37afdbe5-071a-4161-8390-1de33fefd993","Type":"ContainerStarted","Data":"a6441ef62261697c7328940a1dddcf71b4ff8b43682701d11a7b27b40ce4bc7c"} Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.864233 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" podUID="40444baa-6195-4d1a-9704-21874564d865" Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.865539 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" podUID="37afdbe5-071a-4161-8390-1de33fefd993" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.866407 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf" event={"ID":"68bf831f-f763-4c43-b57f-13d244b3a21e","Type":"ContainerStarted","Data":"a00ac6568842644208d1ae35a933c4b1ebefc1ff6b7738d6324340b972c77837"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.872577 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz" event={"ID":"e690f8b6-f8ab-42fb-8e9c-be5dcbc52de4","Type":"ContainerStarted","Data":"21e4ddb336eeed548c4e23df639d7e87305e48eb85d35fc7ef820fd988e10bb6"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.874157 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc" event={"ID":"38e4f81e-db3d-4d6d-824c-cdcf6e42ab1f","Type":"ContainerStarted","Data":"80e1c4f0e7fce986186a42656c446a62f9fa4855b29c5ce450d9e955794534bb"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.876802 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2" event={"ID":"3644aa78-798b-40f2-9041-700fc89959e0","Type":"ContainerStarted","Data":"516060fd1179969ba1f615a86c8be44bb7400a9cfaf82107c853e9d11831863d"} Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.879119 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" 
event={"ID":"00255202-1625-4595-a5d2-90aadb87fcfc","Type":"ContainerStarted","Data":"20bb486a67c7b4384452f6e29337da5f8d1b4e65c5007c3dbf077aa06ab057ac"} Mar 09 02:59:15 crc kubenswrapper[4901]: E0309 02:59:15.880550 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" podUID="00255202-1625-4595-a5d2-90aadb87fcfc" Mar 09 02:59:15 crc kubenswrapper[4901]: I0309 02:59:15.880750 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7" event={"ID":"49531808-52f2-497a-98bc-61883926e221","Type":"ContainerStarted","Data":"75b9744ccb8b227b00359361db888a2f2731abee98435d21a9b0a3992c96db65"} Mar 09 02:59:16 crc kubenswrapper[4901]: I0309 02:59:16.038141 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.038376 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.038476 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert podName:dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974 nodeName:}" failed. 
No retries permitted until 2026-03-09 02:59:18.038449712 +0000 UTC m=+1082.628113444 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" (UID: "dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:16 crc kubenswrapper[4901]: I0309 02:59:16.445029 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:16 crc kubenswrapper[4901]: I0309 02:59:16.445432 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.445653 4901 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.447360 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:18.447334956 +0000 UTC m=+1083.036998688 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "webhook-server-cert" not found Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.450040 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.450106 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:18.450086826 +0000 UTC m=+1083.039750628 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "metrics-server-cert" not found Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.907125 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" podUID="37afdbe5-071a-4161-8390-1de33fefd993" Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.907435 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" 
pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" podUID="4b68ef50-4ff4-420c-a455-ec9dd86db4cc" Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.907585 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" podUID="a6308427-401a-4c01-afdb-e385f8efc20d" Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.907963 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" podUID="00255202-1625-4595-a5d2-90aadb87fcfc" Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.908445 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" podUID="40444baa-6195-4d1a-9704-21874564d865" Mar 09 02:59:16 crc kubenswrapper[4901]: E0309 02:59:16.908485 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" 
podUID="f86fd171-86dc-4bad-8b53-24cde4942e76" Mar 09 02:59:17 crc kubenswrapper[4901]: I0309 02:59:17.766211 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:17 crc kubenswrapper[4901]: E0309 02:59:17.766424 4901 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 02:59:17 crc kubenswrapper[4901]: E0309 02:59:17.766674 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert podName:21d847c9-9877-4e8d-b414-7f8035ebfc32 nodeName:}" failed. No retries permitted until 2026-03-09 02:59:21.766633416 +0000 UTC m=+1086.356297148 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert") pod "infra-operator-controller-manager-f7fcc58b9-24mvj" (UID: "21d847c9-9877-4e8d-b414-7f8035ebfc32") : secret "infra-operator-webhook-server-cert" not found Mar 09 02:59:18 crc kubenswrapper[4901]: I0309 02:59:18.070365 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:18 crc kubenswrapper[4901]: E0309 02:59:18.070544 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:18 crc kubenswrapper[4901]: E0309 02:59:18.070589 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert podName:dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974 nodeName:}" failed. No retries permitted until 2026-03-09 02:59:22.070576225 +0000 UTC m=+1086.660239957 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" (UID: "dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:18 crc kubenswrapper[4901]: I0309 02:59:18.475873 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:18 crc kubenswrapper[4901]: I0309 02:59:18.475996 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:18 crc kubenswrapper[4901]: E0309 02:59:18.476062 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 02:59:18 crc kubenswrapper[4901]: E0309 02:59:18.476144 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:22.476126436 +0000 UTC m=+1087.065790168 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "metrics-server-cert" not found Mar 09 02:59:18 crc kubenswrapper[4901]: E0309 02:59:18.476195 4901 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 02:59:18 crc kubenswrapper[4901]: E0309 02:59:18.476304 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:22.47628527 +0000 UTC m=+1087.065949002 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "webhook-server-cert" not found Mar 09 02:59:21 crc kubenswrapper[4901]: I0309 02:59:21.836622 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:21 crc kubenswrapper[4901]: E0309 02:59:21.836835 4901 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 02:59:21 crc kubenswrapper[4901]: E0309 02:59:21.837282 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert 
podName:21d847c9-9877-4e8d-b414-7f8035ebfc32 nodeName:}" failed. No retries permitted until 2026-03-09 02:59:29.837260203 +0000 UTC m=+1094.426923935 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert") pod "infra-operator-controller-manager-f7fcc58b9-24mvj" (UID: "21d847c9-9877-4e8d-b414-7f8035ebfc32") : secret "infra-operator-webhook-server-cert" not found Mar 09 02:59:22 crc kubenswrapper[4901]: I0309 02:59:22.149724 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:22 crc kubenswrapper[4901]: E0309 02:59:22.149932 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:22 crc kubenswrapper[4901]: E0309 02:59:22.150012 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert podName:dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974 nodeName:}" failed. No retries permitted until 2026-03-09 02:59:30.149993854 +0000 UTC m=+1094.739657586 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" (UID: "dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:22 crc kubenswrapper[4901]: I0309 02:59:22.554861 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:22 crc kubenswrapper[4901]: E0309 02:59:22.555120 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 02:59:22 crc kubenswrapper[4901]: I0309 02:59:22.555359 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:22 crc kubenswrapper[4901]: E0309 02:59:22.555589 4901 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 02:59:22 crc kubenswrapper[4901]: E0309 02:59:22.555665 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:30.555647677 +0000 UTC m=+1095.145311409 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "webhook-server-cert" not found Mar 09 02:59:22 crc kubenswrapper[4901]: E0309 02:59:22.556534 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:30.556516819 +0000 UTC m=+1095.146180631 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "metrics-server-cert" not found Mar 09 02:59:28 crc kubenswrapper[4901]: E0309 02:59:28.828902 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 09 02:59:28 crc kubenswrapper[4901]: E0309 02:59:28.829505 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5mvpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-tkpsk_openstack-operators(6ee7156f-f994-4e79-875d-744fed479fcf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 02:59:28 crc kubenswrapper[4901]: E0309 02:59:28.830650 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" podUID="6ee7156f-f994-4e79-875d-744fed479fcf" Mar 09 02:59:29 crc kubenswrapper[4901]: E0309 02:59:29.086127 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" podUID="6ee7156f-f994-4e79-875d-744fed479fcf" Mar 09 02:59:29 crc kubenswrapper[4901]: I0309 02:59:29.917847 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:29 crc kubenswrapper[4901]: I0309 02:59:29.924878 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21d847c9-9877-4e8d-b414-7f8035ebfc32-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-24mvj\" (UID: \"21d847c9-9877-4e8d-b414-7f8035ebfc32\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:29 crc kubenswrapper[4901]: I0309 02:59:29.960696 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bndbd" Mar 09 02:59:29 crc kubenswrapper[4901]: I0309 02:59:29.967605 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.094659 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb" event={"ID":"15663ad3-38d8-4a71-88fd-28f74b590e6e","Type":"ContainerStarted","Data":"b448e152d20db33d75d618ecb552a283e8f3d9dc79d0677214f974c377a5f80f"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.094931 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.100450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt" event={"ID":"e6489b4e-ac8f-4853-90a2-6f7ec0af3367","Type":"ContainerStarted","Data":"5adaa4ff4851ebff713c349309e196d07d48f184b1f9b330380af8f0fe2d110d"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.100586 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.137621 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb" podStartSLOduration=3.038169149 podStartE2EDuration="16.137603432s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.269750295 +0000 UTC m=+1079.859414027" lastFinishedPulling="2026-03-09 02:59:28.369184568 +0000 UTC m=+1092.958848310" observedRunningTime="2026-03-09 02:59:30.117401025 +0000 UTC m=+1094.707064757" watchObservedRunningTime="2026-03-09 02:59:30.137603432 +0000 UTC m=+1094.727267164" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.144978 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt" podStartSLOduration=2.028680427 podStartE2EDuration="16.144964977s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.365662822 +0000 UTC m=+1079.955326554" lastFinishedPulling="2026-03-09 02:59:29.481947372 +0000 UTC m=+1094.071611104" observedRunningTime="2026-03-09 02:59:30.143651574 +0000 UTC m=+1094.733315306" watchObservedRunningTime="2026-03-09 02:59:30.144964977 +0000 UTC m=+1094.734628709" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.145340 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" event={"ID":"40444baa-6195-4d1a-9704-21874564d865","Type":"ContainerStarted","Data":"ba84bb31a13c733ae4eefa400e5e75b3ba8a14b3fb32936f6c62c9cb51e30175"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.145378 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.145390 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz" event={"ID":"e690f8b6-f8ab-42fb-8e9c-be5dcbc52de4","Type":"ContainerStarted","Data":"55d493676fe615b73f9120a2d25a8fda7744a0ecc06aa4d6a05b3425c2616b3f"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.145907 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.166496 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf" event={"ID":"68bf831f-f763-4c43-b57f-13d244b3a21e","Type":"ContainerStarted","Data":"e32bd20a82466128b7731c9f01af0bd3d1f771a043dffb27be5d66e21df09f4b"} Mar 09 
02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.167312 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.201508 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j" event={"ID":"b99c3916-afa1-4f6c-a25e-ca7a7a30d5c6","Type":"ContainerStarted","Data":"61904b8c4f7661ad26869e1d1314b4cc19b24ada759839521ab0289d36ea96b0"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.202121 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.211617 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz" podStartSLOduration=2.863662564 podStartE2EDuration="17.211598739s" podCreationTimestamp="2026-03-09 02:59:13 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.102986268 +0000 UTC m=+1079.692650000" lastFinishedPulling="2026-03-09 02:59:29.450922443 +0000 UTC m=+1094.040586175" observedRunningTime="2026-03-09 02:59:30.179875983 +0000 UTC m=+1094.769539715" watchObservedRunningTime="2026-03-09 02:59:30.211598739 +0000 UTC m=+1094.801262471" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.213855 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" podStartSLOduration=2.325123789 podStartE2EDuration="16.213848876s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.572257069 +0000 UTC m=+1080.161920801" lastFinishedPulling="2026-03-09 02:59:29.460982156 +0000 UTC m=+1094.050645888" observedRunningTime="2026-03-09 02:59:30.202415809 +0000 UTC m=+1094.792079541" 
watchObservedRunningTime="2026-03-09 02:59:30.213848876 +0000 UTC m=+1094.803512608" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.226925 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:30 crc kubenswrapper[4901]: E0309 02:59:30.227107 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:30 crc kubenswrapper[4901]: E0309 02:59:30.227155 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert podName:dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974 nodeName:}" failed. No retries permitted until 2026-03-09 02:59:46.227141219 +0000 UTC m=+1110.816804951 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" (UID: "dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.241616 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc" event={"ID":"e7f9248a-6f54-4cbd-9225-a601e2dd4e93","Type":"ContainerStarted","Data":"d849ebb5643408ae87034fcb9b13613d49c663b6f74a8819cb9c8cf66ef9f762"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.241828 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.248356 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf" podStartSLOduration=2.867442848 podStartE2EDuration="17.248335021s" podCreationTimestamp="2026-03-09 02:59:13 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.034748925 +0000 UTC m=+1079.624412657" lastFinishedPulling="2026-03-09 02:59:29.415641098 +0000 UTC m=+1094.005304830" observedRunningTime="2026-03-09 02:59:30.232654078 +0000 UTC m=+1094.822317810" watchObservedRunningTime="2026-03-09 02:59:30.248335021 +0000 UTC m=+1094.837998753" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.260164 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd" event={"ID":"17fc5096-7a83-4918-ba7f-213f188a1ce3","Type":"ContainerStarted","Data":"d6db14f1864e62fc139a2e0833eeecadddf713c20068d5c7dbd67887712c4188"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.260765 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.275769 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j" podStartSLOduration=4.0685349 podStartE2EDuration="17.27575478s" podCreationTimestamp="2026-03-09 02:59:13 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.108327192 +0000 UTC m=+1079.697990924" lastFinishedPulling="2026-03-09 02:59:28.315547052 +0000 UTC m=+1092.905210804" observedRunningTime="2026-03-09 02:59:30.274774705 +0000 UTC m=+1094.864438437" watchObservedRunningTime="2026-03-09 02:59:30.27575478 +0000 UTC m=+1094.865418512" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.275901 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj" event={"ID":"150f46cd-b329-412c-b0a9-acd69b79a434","Type":"ContainerStarted","Data":"ecbda44efe81188a83b8d3f1275c0f0824fe1f07e50e1bec978de64dc86ddefe"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.276482 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.288810 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-n5px9" event={"ID":"1b18c23b-f21d-4935-898a-2864b473119c","Type":"ContainerStarted","Data":"90e30791006f1613dd5ea59dca8a47c12bd916094c96224713f4b91615f58956"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.289507 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-n5px9" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.309116 4901 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd" podStartSLOduration=3.05493784 podStartE2EDuration="16.309101937s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.551822196 +0000 UTC m=+1080.141485928" lastFinishedPulling="2026-03-09 02:59:28.805986253 +0000 UTC m=+1093.395650025" observedRunningTime="2026-03-09 02:59:30.305613239 +0000 UTC m=+1094.895276971" watchObservedRunningTime="2026-03-09 02:59:30.309101937 +0000 UTC m=+1094.898765669" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.316930 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc" event={"ID":"38e4f81e-db3d-4d6d-824c-cdcf6e42ab1f","Type":"ContainerStarted","Data":"2423f3daa9e9819e87c94f69c5b8e3118244fa7c494cab6f2f78d635f93cc2e7"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.331868 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc" podStartSLOduration=3.868354985 podStartE2EDuration="17.331854338s" podCreationTimestamp="2026-03-09 02:59:13 +0000 UTC" firstStartedPulling="2026-03-09 02:59:14.851988217 +0000 UTC m=+1079.441651949" lastFinishedPulling="2026-03-09 02:59:28.31548757 +0000 UTC m=+1092.905151302" observedRunningTime="2026-03-09 02:59:30.327768886 +0000 UTC m=+1094.917432618" watchObservedRunningTime="2026-03-09 02:59:30.331854338 +0000 UTC m=+1094.921518070" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.332450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2" event={"ID":"3644aa78-798b-40f2-9041-700fc89959e0","Type":"ContainerStarted","Data":"c2bd543c32dca17307edf136bac9f212bbcc9658bbf96d33dddf4f43938e23c7"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.332908 4901 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.353165 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7" event={"ID":"49531808-52f2-497a-98bc-61883926e221","Type":"ContainerStarted","Data":"3bb757c90db388016ca7364c9541bb46c111ccd5492b1db15a6727ad326215a7"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.353788 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.382801 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p" event={"ID":"5c11cb5a-ab01-422d-a70b-33bb9dd06f8b","Type":"ContainerStarted","Data":"ec9515f48931bee7dc974af5ed73f270296fcb7db668367d9bb6b7a394b1202b"} Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.383363 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.424023 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-n5px9" podStartSLOduration=2.355002149 podStartE2EDuration="16.424006182s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.332178472 +0000 UTC m=+1079.921842204" lastFinishedPulling="2026-03-09 02:59:29.401182505 +0000 UTC m=+1093.990846237" observedRunningTime="2026-03-09 02:59:30.398557973 +0000 UTC m=+1094.988221695" watchObservedRunningTime="2026-03-09 02:59:30.424006182 +0000 UTC m=+1095.013669914" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.467532 4901 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wvsdc" podStartSLOduration=2.678409908 podStartE2EDuration="16.467514384s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.671383237 +0000 UTC m=+1080.261046969" lastFinishedPulling="2026-03-09 02:59:29.460487693 +0000 UTC m=+1094.050151445" observedRunningTime="2026-03-09 02:59:30.428929795 +0000 UTC m=+1095.018593527" watchObservedRunningTime="2026-03-09 02:59:30.467514384 +0000 UTC m=+1095.057178116" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.474370 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj"] Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.478650 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj" podStartSLOduration=4.011512138 podStartE2EDuration="17.478630233s" podCreationTimestamp="2026-03-09 02:59:13 +0000 UTC" firstStartedPulling="2026-03-09 02:59:14.848500819 +0000 UTC m=+1079.438164551" lastFinishedPulling="2026-03-09 02:59:28.315618914 +0000 UTC m=+1092.905282646" observedRunningTime="2026-03-09 02:59:30.460295363 +0000 UTC m=+1095.049959095" watchObservedRunningTime="2026-03-09 02:59:30.478630233 +0000 UTC m=+1095.068293965" Mar 09 02:59:30 crc kubenswrapper[4901]: W0309 02:59:30.489341 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21d847c9_9877_4e8d_b414_7f8035ebfc32.slice/crio-7228cc0f3e10629d1d16f6a185a4e6d90766ce23f1c28bffef7524122e2e90a8 WatchSource:0}: Error finding container 7228cc0f3e10629d1d16f6a185a4e6d90766ce23f1c28bffef7524122e2e90a8: Status 404 returned error can't find the container with id 7228cc0f3e10629d1d16f6a185a4e6d90766ce23f1c28bffef7524122e2e90a8 Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 
02:59:30.494513 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7" podStartSLOduration=3.060767396 podStartE2EDuration="16.494494031s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.372214867 +0000 UTC m=+1079.961878599" lastFinishedPulling="2026-03-09 02:59:28.805941502 +0000 UTC m=+1093.395605234" observedRunningTime="2026-03-09 02:59:30.490019439 +0000 UTC m=+1095.079683171" watchObservedRunningTime="2026-03-09 02:59:30.494494031 +0000 UTC m=+1095.084157763" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.510441 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p" podStartSLOduration=3.242440603 podStartE2EDuration="17.510424561s" podCreationTimestamp="2026-03-09 02:59:13 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.146989393 +0000 UTC m=+1079.736653115" lastFinishedPulling="2026-03-09 02:59:29.414973341 +0000 UTC m=+1094.004637073" observedRunningTime="2026-03-09 02:59:30.509215031 +0000 UTC m=+1095.098878753" watchObservedRunningTime="2026-03-09 02:59:30.510424561 +0000 UTC m=+1095.100088283" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.537143 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2" podStartSLOduration=3.385299539 podStartE2EDuration="17.537129651s" podCreationTimestamp="2026-03-09 02:59:13 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.250839 +0000 UTC m=+1079.840502732" lastFinishedPulling="2026-03-09 02:59:29.402669102 +0000 UTC m=+1093.992332844" observedRunningTime="2026-03-09 02:59:30.536209358 +0000 UTC m=+1095.125873090" watchObservedRunningTime="2026-03-09 02:59:30.537129651 +0000 UTC m=+1095.126793383" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.635881 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.635979 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:30 crc kubenswrapper[4901]: E0309 02:59:30.636110 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 02:59:30 crc kubenswrapper[4901]: E0309 02:59:30.636153 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:46.636140687 +0000 UTC m=+1111.225804419 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "metrics-server-cert" not found Mar 09 02:59:30 crc kubenswrapper[4901]: E0309 02:59:30.636304 4901 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 02:59:30 crc kubenswrapper[4901]: E0309 02:59:30.636340 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs podName:801bcadf-a5a1-4b3c-9564-e1e21ff68f7f nodeName:}" failed. No retries permitted until 2026-03-09 02:59:46.636330692 +0000 UTC m=+1111.225994424 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-jmbhm" (UID: "801bcadf-a5a1-4b3c-9564-e1e21ff68f7f") : secret "webhook-server-cert" not found Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.864676 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 02:59:30 crc kubenswrapper[4901]: I0309 02:59:30.864727 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 02:59:31 crc kubenswrapper[4901]: I0309 02:59:31.393134 4901 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" event={"ID":"21d847c9-9877-4e8d-b414-7f8035ebfc32","Type":"ContainerStarted","Data":"7228cc0f3e10629d1d16f6a185a4e6d90766ce23f1c28bffef7524122e2e90a8"} Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.180343 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-l7qrj" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.188988 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7sctc" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.247712 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bvjjz" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.268107 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-gclrf" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.323566 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-xrb6p" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.344528 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-lfc7j" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.492124 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-5qmp2" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.540450 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-x98gb" Mar 09 02:59:34 crc 
kubenswrapper[4901]: I0309 02:59:34.622242 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-n5px9" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.661653 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-h2bk7" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.671170 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6wntt" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.875294 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-5z5dd" Mar 09 02:59:34 crc kubenswrapper[4901]: I0309 02:59:34.956557 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qs6f" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.436078 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" event={"ID":"37afdbe5-071a-4161-8390-1de33fefd993","Type":"ContainerStarted","Data":"e2a966384b303f0c0099876032f1429b7006c8799a86e1d14fffe91fa31cc8bc"} Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.436821 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.437181 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" event={"ID":"4b68ef50-4ff4-420c-a455-ec9dd86db4cc","Type":"ContainerStarted","Data":"4565ec2c7f246e8c198f2185d98681b49c1c7741e8587c86c543b31760e02aee"} Mar 09 02:59:36 crc 
kubenswrapper[4901]: I0309 02:59:36.437563 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.438804 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" event={"ID":"21d847c9-9877-4e8d-b414-7f8035ebfc32","Type":"ContainerStarted","Data":"7195c371abe5f45bd7ce32e2a057f77f6c62a2d77b82e7d2fd72b6eb7efb33cb"} Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.438933 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.439773 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" event={"ID":"a6308427-401a-4c01-afdb-e385f8efc20d","Type":"ContainerStarted","Data":"02dc12203d0a139992f404bd106746896140e914699a00d8082f9d9b85feade4"} Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.439916 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.441581 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" event={"ID":"f86fd171-86dc-4bad-8b53-24cde4942e76","Type":"ContainerStarted","Data":"471770ed2a99d6a1dc9bdf5097149ef72ea28bef424b0a7ec2992f793f0f49a0"} Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.441786 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.443955 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" event={"ID":"00255202-1625-4595-a5d2-90aadb87fcfc","Type":"ContainerStarted","Data":"58dfbd2f5ebc29c973d870b2c85126cfb7d2f12e4cadbb58d3574173d035b58f"} Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.444143 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.457337 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" podStartSLOduration=2.427582939 podStartE2EDuration="22.457318609s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.571002257 +0000 UTC m=+1080.160665989" lastFinishedPulling="2026-03-09 02:59:35.600737907 +0000 UTC m=+1100.190401659" observedRunningTime="2026-03-09 02:59:36.452620321 +0000 UTC m=+1101.042284063" watchObservedRunningTime="2026-03-09 02:59:36.457318609 +0000 UTC m=+1101.046982341" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.467522 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" podStartSLOduration=2.334422941 podStartE2EDuration="22.467507415s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.565671393 +0000 UTC m=+1080.155335125" lastFinishedPulling="2026-03-09 02:59:35.698755867 +0000 UTC m=+1100.288419599" observedRunningTime="2026-03-09 02:59:36.465062174 +0000 UTC m=+1101.054725926" watchObservedRunningTime="2026-03-09 02:59:36.467507415 +0000 UTC m=+1101.057171147" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.478025 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" 
podStartSLOduration=18.381042206 podStartE2EDuration="23.478009029s" podCreationTimestamp="2026-03-09 02:59:13 +0000 UTC" firstStartedPulling="2026-03-09 02:59:30.506290317 +0000 UTC m=+1095.095954049" lastFinishedPulling="2026-03-09 02:59:35.60325713 +0000 UTC m=+1100.192920872" observedRunningTime="2026-03-09 02:59:36.477476225 +0000 UTC m=+1101.067139947" watchObservedRunningTime="2026-03-09 02:59:36.478009029 +0000 UTC m=+1101.067672761" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.491478 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" podStartSLOduration=2.26470565 podStartE2EDuration="22.491462216s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.376653498 +0000 UTC m=+1079.966317230" lastFinishedPulling="2026-03-09 02:59:35.603410044 +0000 UTC m=+1100.193073796" observedRunningTime="2026-03-09 02:59:36.488384159 +0000 UTC m=+1101.078047891" watchObservedRunningTime="2026-03-09 02:59:36.491462216 +0000 UTC m=+1101.081125938" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.502130 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" podStartSLOduration=2.275512032 podStartE2EDuration="22.502110394s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.372557305 +0000 UTC m=+1079.962221037" lastFinishedPulling="2026-03-09 02:59:35.599155657 +0000 UTC m=+1100.188819399" observedRunningTime="2026-03-09 02:59:36.500696578 +0000 UTC m=+1101.090360320" watchObservedRunningTime="2026-03-09 02:59:36.502110394 +0000 UTC m=+1101.091774146" Mar 09 02:59:36 crc kubenswrapper[4901]: I0309 02:59:36.519355 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" 
podStartSLOduration=2.46781625 podStartE2EDuration="22.519331096s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.552657927 +0000 UTC m=+1080.142321659" lastFinishedPulling="2026-03-09 02:59:35.604172763 +0000 UTC m=+1100.193836505" observedRunningTime="2026-03-09 02:59:36.516201397 +0000 UTC m=+1101.105865139" watchObservedRunningTime="2026-03-09 02:59:36.519331096 +0000 UTC m=+1101.108994848" Mar 09 02:59:44 crc kubenswrapper[4901]: I0309 02:59:44.662518 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-bvb9v" Mar 09 02:59:44 crc kubenswrapper[4901]: I0309 02:59:44.754212 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mksnm" Mar 09 02:59:44 crc kubenswrapper[4901]: I0309 02:59:44.903441 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-2qbw7" Mar 09 02:59:44 crc kubenswrapper[4901]: I0309 02:59:44.930946 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-8d9wj" Mar 09 02:59:45 crc kubenswrapper[4901]: I0309 02:59:45.014204 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-r2rzc" Mar 09 02:59:46 crc kubenswrapper[4901]: I0309 02:59:46.290601 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:46 crc 
kubenswrapper[4901]: I0309 02:59:46.300554 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx\" (UID: \"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:46 crc kubenswrapper[4901]: I0309 02:59:46.331318 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-76rl6" Mar 09 02:59:46 crc kubenswrapper[4901]: I0309 02:59:46.340394 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:46 crc kubenswrapper[4901]: I0309 02:59:46.696609 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:46 crc kubenswrapper[4901]: I0309 02:59:46.696903 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:46 crc kubenswrapper[4901]: I0309 02:59:46.703731 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-metrics-certs\") 
pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:46 crc kubenswrapper[4901]: I0309 02:59:46.705169 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/801bcadf-a5a1-4b3c-9564-e1e21ff68f7f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-jmbhm\" (UID: \"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:46 crc kubenswrapper[4901]: I0309 02:59:46.830311 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx"] Mar 09 02:59:46 crc kubenswrapper[4901]: I0309 02:59:46.855521 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-s5d64" Mar 09 02:59:46 crc kubenswrapper[4901]: I0309 02:59:46.863865 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:47 crc kubenswrapper[4901]: W0309 02:59:47.117292 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod801bcadf_a5a1_4b3c_9564_e1e21ff68f7f.slice/crio-6e4afd6bdc8d389c8525e1799a51b56e5d87ef89102a09421967f6bce118b3fb WatchSource:0}: Error finding container 6e4afd6bdc8d389c8525e1799a51b56e5d87ef89102a09421967f6bce118b3fb: Status 404 returned error can't find the container with id 6e4afd6bdc8d389c8525e1799a51b56e5d87ef89102a09421967f6bce118b3fb Mar 09 02:59:47 crc kubenswrapper[4901]: I0309 02:59:47.117587 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm"] Mar 09 02:59:47 crc kubenswrapper[4901]: I0309 02:59:47.528319 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" event={"ID":"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974","Type":"ContainerStarted","Data":"0ba480c228437683135abceba5488dad294ce916d99f5fa449d1b7008794f30c"} Mar 09 02:59:47 crc kubenswrapper[4901]: I0309 02:59:47.530746 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" event={"ID":"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f","Type":"ContainerStarted","Data":"6e4afd6bdc8d389c8525e1799a51b56e5d87ef89102a09421967f6bce118b3fb"} Mar 09 02:59:48 crc kubenswrapper[4901]: I0309 02:59:48.542090 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" event={"ID":"801bcadf-a5a1-4b3c-9564-e1e21ff68f7f","Type":"ContainerStarted","Data":"36c0d3c2153f8d95906a4fb4651330cd1ae31b2870e0c9c2ed7970742f5127c0"} Mar 09 02:59:48 crc kubenswrapper[4901]: I0309 02:59:48.542294 4901 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 02:59:48 crc kubenswrapper[4901]: I0309 02:59:48.606210 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" podStartSLOduration=34.60614449 podStartE2EDuration="34.60614449s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 02:59:48.589380249 +0000 UTC m=+1113.179044021" watchObservedRunningTime="2026-03-09 02:59:48.60614449 +0000 UTC m=+1113.195808262" Mar 09 02:59:49 crc kubenswrapper[4901]: I0309 02:59:49.975050 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-24mvj" Mar 09 02:59:50 crc kubenswrapper[4901]: I0309 02:59:50.561916 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" event={"ID":"6ee7156f-f994-4e79-875d-744fed479fcf","Type":"ContainerStarted","Data":"3c332053698de4c2325d4351f4b3504779abdc12a5708bf8cb397f90d43923ee"} Mar 09 02:59:50 crc kubenswrapper[4901]: I0309 02:59:50.563017 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" Mar 09 02:59:50 crc kubenswrapper[4901]: I0309 02:59:50.590560 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" podStartSLOduration=3.536849882 podStartE2EDuration="37.590532224s" podCreationTimestamp="2026-03-09 02:59:13 +0000 UTC" firstStartedPulling="2026-03-09 02:59:15.350800349 +0000 UTC m=+1079.940464081" lastFinishedPulling="2026-03-09 02:59:49.404482691 +0000 UTC 
m=+1113.994146423" observedRunningTime="2026-03-09 02:59:50.584885822 +0000 UTC m=+1115.174549564" watchObservedRunningTime="2026-03-09 02:59:50.590532224 +0000 UTC m=+1115.180195996" Mar 09 02:59:51 crc kubenswrapper[4901]: I0309 02:59:51.572793 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" event={"ID":"dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974","Type":"ContainerStarted","Data":"564cce6d5906a647cf23a3d3ea0591b0527c3ffca50c1bb22092926db07cd826"} Mar 09 02:59:51 crc kubenswrapper[4901]: I0309 02:59:51.573108 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:51 crc kubenswrapper[4901]: I0309 02:59:51.599703 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" podStartSLOduration=33.980172615 podStartE2EDuration="37.599680637s" podCreationTimestamp="2026-03-09 02:59:14 +0000 UTC" firstStartedPulling="2026-03-09 02:59:46.84269611 +0000 UTC m=+1111.432359872" lastFinishedPulling="2026-03-09 02:59:50.462204162 +0000 UTC m=+1115.051867894" observedRunningTime="2026-03-09 02:59:51.594379714 +0000 UTC m=+1116.184043446" watchObservedRunningTime="2026-03-09 02:59:51.599680637 +0000 UTC m=+1116.189344369" Mar 09 02:59:54 crc kubenswrapper[4901]: I0309 02:59:54.610393 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-tkpsk" Mar 09 02:59:56 crc kubenswrapper[4901]: I0309 02:59:56.350579 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx" Mar 09 02:59:56 crc kubenswrapper[4901]: I0309 02:59:56.872546 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-jmbhm" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.160063 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550420-9l54t"] Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.161795 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550420-9l54t" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.166134 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.166207 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.167520 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.181066 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550420-9l54t"] Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.181120 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq"] Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.181909 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.195721 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.196115 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.200900 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq"] Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.312498 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d19c465-0987-41b6-b948-6a61d368bac4-secret-volume\") pod \"collect-profiles-29550420-pqfpq\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.312658 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w94vv\" (UniqueName: \"kubernetes.io/projected/0ddd9ca0-2302-4a94-9d07-fbf8db553b73-kube-api-access-w94vv\") pod \"auto-csr-approver-29550420-9l54t\" (UID: \"0ddd9ca0-2302-4a94-9d07-fbf8db553b73\") " pod="openshift-infra/auto-csr-approver-29550420-9l54t" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.312967 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjd76\" (UniqueName: \"kubernetes.io/projected/7d19c465-0987-41b6-b948-6a61d368bac4-kube-api-access-gjd76\") pod \"collect-profiles-29550420-pqfpq\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.313116 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d19c465-0987-41b6-b948-6a61d368bac4-config-volume\") pod \"collect-profiles-29550420-pqfpq\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.414928 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjd76\" (UniqueName: \"kubernetes.io/projected/7d19c465-0987-41b6-b948-6a61d368bac4-kube-api-access-gjd76\") pod \"collect-profiles-29550420-pqfpq\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.415018 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d19c465-0987-41b6-b948-6a61d368bac4-config-volume\") pod \"collect-profiles-29550420-pqfpq\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.415068 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d19c465-0987-41b6-b948-6a61d368bac4-secret-volume\") pod \"collect-profiles-29550420-pqfpq\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.415141 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w94vv\" (UniqueName: 
\"kubernetes.io/projected/0ddd9ca0-2302-4a94-9d07-fbf8db553b73-kube-api-access-w94vv\") pod \"auto-csr-approver-29550420-9l54t\" (UID: \"0ddd9ca0-2302-4a94-9d07-fbf8db553b73\") " pod="openshift-infra/auto-csr-approver-29550420-9l54t" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.416945 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d19c465-0987-41b6-b948-6a61d368bac4-config-volume\") pod \"collect-profiles-29550420-pqfpq\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.425476 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d19c465-0987-41b6-b948-6a61d368bac4-secret-volume\") pod \"collect-profiles-29550420-pqfpq\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.444158 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjd76\" (UniqueName: \"kubernetes.io/projected/7d19c465-0987-41b6-b948-6a61d368bac4-kube-api-access-gjd76\") pod \"collect-profiles-29550420-pqfpq\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.445505 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w94vv\" (UniqueName: \"kubernetes.io/projected/0ddd9ca0-2302-4a94-9d07-fbf8db553b73-kube-api-access-w94vv\") pod \"auto-csr-approver-29550420-9l54t\" (UID: \"0ddd9ca0-2302-4a94-9d07-fbf8db553b73\") " pod="openshift-infra/auto-csr-approver-29550420-9l54t" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.528787 4901 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550420-9l54t" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.543572 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.867499 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.867752 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:00:00 crc kubenswrapper[4901]: W0309 03:00:00.875422 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ddd9ca0_2302_4a94_9d07_fbf8db553b73.slice/crio-7e3564a06575895f3bab8cf2824571105cdcfe4a267774b7e41c23c4c5d3d7fc WatchSource:0}: Error finding container 7e3564a06575895f3bab8cf2824571105cdcfe4a267774b7e41c23c4c5d3d7fc: Status 404 returned error can't find the container with id 7e3564a06575895f3bab8cf2824571105cdcfe4a267774b7e41c23c4c5d3d7fc Mar 09 03:00:00 crc kubenswrapper[4901]: I0309 03:00:00.878498 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550420-9l54t"] Mar 09 03:00:01 crc kubenswrapper[4901]: I0309 03:00:01.051625 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq"] Mar 09 03:00:01 crc kubenswrapper[4901]: 
W0309 03:00:01.054448 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d19c465_0987_41b6_b948_6a61d368bac4.slice/crio-69ab2897d93e35494420b220e277e4b5630235d0cdea66bbd14b91a08592338a WatchSource:0}: Error finding container 69ab2897d93e35494420b220e277e4b5630235d0cdea66bbd14b91a08592338a: Status 404 returned error can't find the container with id 69ab2897d93e35494420b220e277e4b5630235d0cdea66bbd14b91a08592338a Mar 09 03:00:01 crc kubenswrapper[4901]: E0309 03:00:01.454251 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d19c465_0987_41b6_b948_6a61d368bac4.slice/crio-ed9c4c3e79f1be3c311a803930c595ab18d827a0f272a4923403ead5508b1d9d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d19c465_0987_41b6_b948_6a61d368bac4.slice/crio-conmon-ed9c4c3e79f1be3c311a803930c595ab18d827a0f272a4923403ead5508b1d9d.scope\": RecentStats: unable to find data in memory cache]" Mar 09 03:00:01 crc kubenswrapper[4901]: I0309 03:00:01.660974 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550420-9l54t" event={"ID":"0ddd9ca0-2302-4a94-9d07-fbf8db553b73","Type":"ContainerStarted","Data":"7e3564a06575895f3bab8cf2824571105cdcfe4a267774b7e41c23c4c5d3d7fc"} Mar 09 03:00:01 crc kubenswrapper[4901]: I0309 03:00:01.663215 4901 generic.go:334] "Generic (PLEG): container finished" podID="7d19c465-0987-41b6-b948-6a61d368bac4" containerID="ed9c4c3e79f1be3c311a803930c595ab18d827a0f272a4923403ead5508b1d9d" exitCode=0 Mar 09 03:00:01 crc kubenswrapper[4901]: I0309 03:00:01.663270 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" 
event={"ID":"7d19c465-0987-41b6-b948-6a61d368bac4","Type":"ContainerDied","Data":"ed9c4c3e79f1be3c311a803930c595ab18d827a0f272a4923403ead5508b1d9d"} Mar 09 03:00:01 crc kubenswrapper[4901]: I0309 03:00:01.663304 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" event={"ID":"7d19c465-0987-41b6-b948-6a61d368bac4","Type":"ContainerStarted","Data":"69ab2897d93e35494420b220e277e4b5630235d0cdea66bbd14b91a08592338a"} Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.049878 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.158267 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjd76\" (UniqueName: \"kubernetes.io/projected/7d19c465-0987-41b6-b948-6a61d368bac4-kube-api-access-gjd76\") pod \"7d19c465-0987-41b6-b948-6a61d368bac4\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.158565 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d19c465-0987-41b6-b948-6a61d368bac4-config-volume\") pod \"7d19c465-0987-41b6-b948-6a61d368bac4\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.158758 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d19c465-0987-41b6-b948-6a61d368bac4-secret-volume\") pod \"7d19c465-0987-41b6-b948-6a61d368bac4\" (UID: \"7d19c465-0987-41b6-b948-6a61d368bac4\") " Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.160123 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d19c465-0987-41b6-b948-6a61d368bac4-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "7d19c465-0987-41b6-b948-6a61d368bac4" (UID: "7d19c465-0987-41b6-b948-6a61d368bac4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.167621 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d19c465-0987-41b6-b948-6a61d368bac4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d19c465-0987-41b6-b948-6a61d368bac4" (UID: "7d19c465-0987-41b6-b948-6a61d368bac4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.167782 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d19c465-0987-41b6-b948-6a61d368bac4-kube-api-access-gjd76" (OuterVolumeSpecName: "kube-api-access-gjd76") pod "7d19c465-0987-41b6-b948-6a61d368bac4" (UID: "7d19c465-0987-41b6-b948-6a61d368bac4"). InnerVolumeSpecName "kube-api-access-gjd76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.261073 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d19c465-0987-41b6-b948-6a61d368bac4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.261119 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjd76\" (UniqueName: \"kubernetes.io/projected/7d19c465-0987-41b6-b948-6a61d368bac4-kube-api-access-gjd76\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.261132 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d19c465-0987-41b6-b948-6a61d368bac4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.683287 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" event={"ID":"7d19c465-0987-41b6-b948-6a61d368bac4","Type":"ContainerDied","Data":"69ab2897d93e35494420b220e277e4b5630235d0cdea66bbd14b91a08592338a"} Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.683538 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69ab2897d93e35494420b220e277e4b5630235d0cdea66bbd14b91a08592338a" Mar 09 03:00:03 crc kubenswrapper[4901]: I0309 03:00:03.683399 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq" Mar 09 03:00:05 crc kubenswrapper[4901]: I0309 03:00:05.701368 4901 generic.go:334] "Generic (PLEG): container finished" podID="0ddd9ca0-2302-4a94-9d07-fbf8db553b73" containerID="2e9420df9d722ebd4bebb2b0d2034e677b415c23cb883ab6d8be4a18aba0d18c" exitCode=0 Mar 09 03:00:05 crc kubenswrapper[4901]: I0309 03:00:05.701697 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550420-9l54t" event={"ID":"0ddd9ca0-2302-4a94-9d07-fbf8db553b73","Type":"ContainerDied","Data":"2e9420df9d722ebd4bebb2b0d2034e677b415c23cb883ab6d8be4a18aba0d18c"} Mar 09 03:00:07 crc kubenswrapper[4901]: I0309 03:00:07.010478 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550420-9l54t" Mar 09 03:00:07 crc kubenswrapper[4901]: I0309 03:00:07.119264 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w94vv\" (UniqueName: \"kubernetes.io/projected/0ddd9ca0-2302-4a94-9d07-fbf8db553b73-kube-api-access-w94vv\") pod \"0ddd9ca0-2302-4a94-9d07-fbf8db553b73\" (UID: \"0ddd9ca0-2302-4a94-9d07-fbf8db553b73\") " Mar 09 03:00:07 crc kubenswrapper[4901]: I0309 03:00:07.126560 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ddd9ca0-2302-4a94-9d07-fbf8db553b73-kube-api-access-w94vv" (OuterVolumeSpecName: "kube-api-access-w94vv") pod "0ddd9ca0-2302-4a94-9d07-fbf8db553b73" (UID: "0ddd9ca0-2302-4a94-9d07-fbf8db553b73"). InnerVolumeSpecName "kube-api-access-w94vv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:00:07 crc kubenswrapper[4901]: I0309 03:00:07.221291 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w94vv\" (UniqueName: \"kubernetes.io/projected/0ddd9ca0-2302-4a94-9d07-fbf8db553b73-kube-api-access-w94vv\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:07 crc kubenswrapper[4901]: I0309 03:00:07.728426 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550420-9l54t" event={"ID":"0ddd9ca0-2302-4a94-9d07-fbf8db553b73","Type":"ContainerDied","Data":"7e3564a06575895f3bab8cf2824571105cdcfe4a267774b7e41c23c4c5d3d7fc"} Mar 09 03:00:07 crc kubenswrapper[4901]: I0309 03:00:07.728474 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e3564a06575895f3bab8cf2824571105cdcfe4a267774b7e41c23c4c5d3d7fc" Mar 09 03:00:07 crc kubenswrapper[4901]: I0309 03:00:07.728488 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550420-9l54t" Mar 09 03:00:08 crc kubenswrapper[4901]: I0309 03:00:08.084252 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550414-zl74c"] Mar 09 03:00:08 crc kubenswrapper[4901]: I0309 03:00:08.096858 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550414-zl74c"] Mar 09 03:00:08 crc kubenswrapper[4901]: I0309 03:00:08.115811 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439c6578-bec4-4371-9f36-bbe54de578bf" path="/var/lib/kubelet/pods/439c6578-bec4-4371-9f36-bbe54de578bf/volumes" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.939523 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-vcd9z"] Mar 09 03:00:12 crc kubenswrapper[4901]: E0309 03:00:12.941324 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7d19c465-0987-41b6-b948-6a61d368bac4" containerName="collect-profiles" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.941437 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d19c465-0987-41b6-b948-6a61d368bac4" containerName="collect-profiles" Mar 09 03:00:12 crc kubenswrapper[4901]: E0309 03:00:12.941523 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ddd9ca0-2302-4a94-9d07-fbf8db553b73" containerName="oc" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.941651 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ddd9ca0-2302-4a94-9d07-fbf8db553b73" containerName="oc" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.941900 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d19c465-0987-41b6-b948-6a61d368bac4" containerName="collect-profiles" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.941997 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ddd9ca0-2302-4a94-9d07-fbf8db553b73" containerName="oc" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.942914 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.946145 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.946667 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.947143 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-fzv77" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.948486 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 09 03:00:12 crc kubenswrapper[4901]: I0309 03:00:12.954276 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-vcd9z"] Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.006549 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-config\") pod \"dnsmasq-dns-589db6c89c-vcd9z\" (UID: \"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a\") " pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.006944 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t6fs\" (UniqueName: \"kubernetes.io/projected/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-kube-api-access-5t6fs\") pod \"dnsmasq-dns-589db6c89c-vcd9z\" (UID: \"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a\") " pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.064419 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4qr9p"] Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.065552 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.067963 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.079868 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4qr9p"] Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.107930 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t6fs\" (UniqueName: \"kubernetes.io/projected/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-kube-api-access-5t6fs\") pod \"dnsmasq-dns-589db6c89c-vcd9z\" (UID: \"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a\") " pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.107989 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-config\") pod \"dnsmasq-dns-589db6c89c-vcd9z\" (UID: \"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a\") " pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.110175 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-config\") pod \"dnsmasq-dns-589db6c89c-vcd9z\" (UID: \"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a\") " pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.129249 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t6fs\" (UniqueName: \"kubernetes.io/projected/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-kube-api-access-5t6fs\") pod \"dnsmasq-dns-589db6c89c-vcd9z\" (UID: \"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a\") " pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.209244 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcd8w\" (UniqueName: \"kubernetes.io/projected/7a9998e9-fba6-4227-9fae-a916d6143d7d-kube-api-access-zcd8w\") pod \"dnsmasq-dns-86bbd886cf-4qr9p\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.209332 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-4qr9p\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.209425 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-config\") pod \"dnsmasq-dns-86bbd886cf-4qr9p\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.264836 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.310506 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-config\") pod \"dnsmasq-dns-86bbd886cf-4qr9p\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.310600 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcd8w\" (UniqueName: \"kubernetes.io/projected/7a9998e9-fba6-4227-9fae-a916d6143d7d-kube-api-access-zcd8w\") pod \"dnsmasq-dns-86bbd886cf-4qr9p\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.310645 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-4qr9p\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.311514 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-4qr9p\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.312379 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-config\") pod \"dnsmasq-dns-86bbd886cf-4qr9p\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 
03:00:13.340111 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcd8w\" (UniqueName: \"kubernetes.io/projected/7a9998e9-fba6-4227-9fae-a916d6143d7d-kube-api-access-zcd8w\") pod \"dnsmasq-dns-86bbd886cf-4qr9p\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.382197 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.760487 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.763543 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-vcd9z"] Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.775038 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" event={"ID":"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a","Type":"ContainerStarted","Data":"c0faff36f2b82479038d2ec0b7a707c9b803ccf7555cb143101c8a3e962c88c3"} Mar 09 03:00:13 crc kubenswrapper[4901]: I0309 03:00:13.827659 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4qr9p"] Mar 09 03:00:13 crc kubenswrapper[4901]: W0309 03:00:13.841504 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a9998e9_fba6_4227_9fae_a916d6143d7d.slice/crio-4e93826f3d55602b22ae95f5a29e8514a3571c0218702419830155718f040adc WatchSource:0}: Error finding container 4e93826f3d55602b22ae95f5a29e8514a3571c0218702419830155718f040adc: Status 404 returned error can't find the container with id 4e93826f3d55602b22ae95f5a29e8514a3571c0218702419830155718f040adc Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.229993 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-589db6c89c-vcd9z"] Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.259955 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-fc488"] Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.264512 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.279981 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-fc488"] Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.427529 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-config\") pod \"dnsmasq-dns-79f9fc56ff-fc488\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.427683 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-fc488\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.427706 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xdz\" (UniqueName: \"kubernetes.io/projected/375fef20-b677-465b-bc26-cb609f6babae-kube-api-access-g8xdz\") pod \"dnsmasq-dns-79f9fc56ff-fc488\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.529456 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-fc488\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.529513 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8xdz\" (UniqueName: \"kubernetes.io/projected/375fef20-b677-465b-bc26-cb609f6babae-kube-api-access-g8xdz\") pod \"dnsmasq-dns-79f9fc56ff-fc488\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.529589 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-config\") pod \"dnsmasq-dns-79f9fc56ff-fc488\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.531072 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-fc488\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.531401 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-config\") pod \"dnsmasq-dns-79f9fc56ff-fc488\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.546256 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8xdz\" (UniqueName: \"kubernetes.io/projected/375fef20-b677-465b-bc26-cb609f6babae-kube-api-access-g8xdz\") pod 
\"dnsmasq-dns-79f9fc56ff-fc488\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.587097 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:14 crc kubenswrapper[4901]: I0309 03:00:14.784658 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" event={"ID":"7a9998e9-fba6-4227-9fae-a916d6143d7d","Type":"ContainerStarted","Data":"4e93826f3d55602b22ae95f5a29e8514a3571c0218702419830155718f040adc"} Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.102589 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4qr9p"] Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.116547 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-fc488"] Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.146161 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-j2tdt"] Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.147349 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.154561 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-j2tdt"] Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.241887 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-j2tdt\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.241971 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-config\") pod \"dnsmasq-dns-7c47bcb9f9-j2tdt\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.241996 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgnnz\" (UniqueName: \"kubernetes.io/projected/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-kube-api-access-bgnnz\") pod \"dnsmasq-dns-7c47bcb9f9-j2tdt\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.343896 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-j2tdt\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.345384 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-config\") pod \"dnsmasq-dns-7c47bcb9f9-j2tdt\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.345419 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgnnz\" (UniqueName: \"kubernetes.io/projected/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-kube-api-access-bgnnz\") pod \"dnsmasq-dns-7c47bcb9f9-j2tdt\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.345305 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-j2tdt\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.346396 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-config\") pod \"dnsmasq-dns-7c47bcb9f9-j2tdt\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.359844 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgnnz\" (UniqueName: \"kubernetes.io/projected/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-kube-api-access-bgnnz\") pod \"dnsmasq-dns-7c47bcb9f9-j2tdt\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.388288 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.390958 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.393621 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-v4svx" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.394921 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.395421 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.395608 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.395846 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.396418 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.398004 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.403865 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.474009 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.548868 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.548919 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95n4v\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-kube-api-access-95n4v\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.548953 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.549018 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.549146 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46c7df0b-fc0a-4fd9-b097-72da03442510-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.549295 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.549371 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46c7df0b-fc0a-4fd9-b097-72da03442510-pod-info\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.549385 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.549417 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-server-conf\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.549452 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: 
I0309 03:00:15.549524 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651339 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651561 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651608 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95n4v\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-kube-api-access-95n4v\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651629 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651656 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651692 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46c7df0b-fc0a-4fd9-b097-72da03442510-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651727 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651775 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46c7df0b-fc0a-4fd9-b097-72da03442510-pod-info\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651790 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651809 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-server-conf\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " 
pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.651842 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.652245 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.652476 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.652747 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.652983 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.653355 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-server-conf\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.653592 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.657750 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46c7df0b-fc0a-4fd9-b097-72da03442510-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.658486 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.660267 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46c7df0b-fc0a-4fd9-b097-72da03442510-pod-info\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.674662 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " 
pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.678891 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95n4v\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-kube-api-access-95n4v\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.692190 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.711637 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.797617 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" event={"ID":"375fef20-b677-465b-bc26-cb609f6babae","Type":"ContainerStarted","Data":"95b56f47a0b9ac70edf505f6646db379f6a4fb3662601fa0b96e0ace3ab139c1"} Mar 09 03:00:15 crc kubenswrapper[4901]: I0309 03:00:15.913826 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-j2tdt"] Mar 09 03:00:15 crc kubenswrapper[4901]: W0309 03:00:15.923307 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc09579bf_bfa6_4f4a_b7c3_4be8a914270e.slice/crio-0c0bcb4e430b06af5a8ad6ac3f00c53d885d8630c6fbfebe108a30883eb120d8 WatchSource:0}: Error finding container 0c0bcb4e430b06af5a8ad6ac3f00c53d885d8630c6fbfebe108a30883eb120d8: Status 404 returned error can't find the container with id 0c0bcb4e430b06af5a8ad6ac3f00c53d885d8630c6fbfebe108a30883eb120d8 Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.154839 
4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 03:00:16 crc kubenswrapper[4901]: W0309 03:00:16.165334 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46c7df0b_fc0a_4fd9_b097_72da03442510.slice/crio-d09b0ebec0684a62c7563c9ab80e719b64e7aaa08a6829e9b18c6e07a9220e17 WatchSource:0}: Error finding container d09b0ebec0684a62c7563c9ab80e719b64e7aaa08a6829e9b18c6e07a9220e17: Status 404 returned error can't find the container with id d09b0ebec0684a62c7563c9ab80e719b64e7aaa08a6829e9b18c6e07a9220e17 Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.279862 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.281000 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.282934 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.283008 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.283107 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.283411 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.283479 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hs8w6" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.283571 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.285315 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.290180 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.363984 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.364033 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.364060 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.364086 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.364111 
4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.364157 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98538e55-cb87-49e2-9fd5-fff06d7edfdd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.364184 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.364202 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.364234 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98538e55-cb87-49e2-9fd5-fff06d7edfdd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.364260 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.366593 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrztr\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-kube-api-access-wrztr\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476369 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476420 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476446 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98538e55-cb87-49e2-9fd5-fff06d7edfdd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476466 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476493 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrztr\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-kube-api-access-wrztr\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476529 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476545 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476563 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476587 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476612 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.476654 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98538e55-cb87-49e2-9fd5-fff06d7edfdd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.477435 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.479421 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.479847 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-erlang-cookie\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.480653 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.481968 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.482187 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.485603 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.487709 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc 
kubenswrapper[4901]: I0309 03:00:16.506963 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrztr\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-kube-api-access-wrztr\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.515979 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98538e55-cb87-49e2-9fd5-fff06d7edfdd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.536544 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98538e55-cb87-49e2-9fd5-fff06d7edfdd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.545241 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.606024 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.813609 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" event={"ID":"c09579bf-bfa6-4f4a-b7c3-4be8a914270e","Type":"ContainerStarted","Data":"0c0bcb4e430b06af5a8ad6ac3f00c53d885d8630c6fbfebe108a30883eb120d8"} Mar 09 03:00:16 crc kubenswrapper[4901]: I0309 03:00:16.815156 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46c7df0b-fc0a-4fd9-b097-72da03442510","Type":"ContainerStarted","Data":"d09b0ebec0684a62c7563c9ab80e719b64e7aaa08a6829e9b18c6e07a9220e17"} Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.145940 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 03:00:17 crc kubenswrapper[4901]: W0309 03:00:17.167638 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98538e55_cb87_49e2_9fd5_fff06d7edfdd.slice/crio-d7fc083e19dda162052a9d32b6cce2981ad403f0496af4440f9b9fc162c5bfd3 WatchSource:0}: Error finding container d7fc083e19dda162052a9d32b6cce2981ad403f0496af4440f9b9fc162c5bfd3: Status 404 returned error can't find the container with id d7fc083e19dda162052a9d32b6cce2981ad403f0496af4440f9b9fc162c5bfd3 Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.525160 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.526701 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.529901 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ls4vr" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.530108 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.530262 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.531927 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.539105 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.539630 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.591087 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.591143 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.591167 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.591217 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-kolla-config\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.591317 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-default\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.591373 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czcks\" (UniqueName: \"kubernetes.io/projected/9df0684a-2816-4af7-97cf-00e31c542eef-kube-api-access-czcks\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.591403 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.591447 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.616806 4901 scope.go:117] "RemoveContainer" containerID="da4a75ba9b15303e5fc81245b4255efd1bebcc96b23483870a8d5b6ef37602ff" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.693134 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-kolla-config\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.693189 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-default\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.693214 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czcks\" (UniqueName: \"kubernetes.io/projected/9df0684a-2816-4af7-97cf-00e31c542eef-kube-api-access-czcks\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.693245 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.693270 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.693325 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.693354 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.693373 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.694319 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-default\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.694705 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.694819 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-kolla-config\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.694990 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.695324 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.708428 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.708525 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " 
pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.715412 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czcks\" (UniqueName: \"kubernetes.io/projected/9df0684a-2816-4af7-97cf-00e31c542eef-kube-api-access-czcks\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.748385 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " pod="openstack/openstack-galera-0" Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.854475 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98538e55-cb87-49e2-9fd5-fff06d7edfdd","Type":"ContainerStarted","Data":"d7fc083e19dda162052a9d32b6cce2981ad403f0496af4440f9b9fc162c5bfd3"} Mar 09 03:00:17 crc kubenswrapper[4901]: I0309 03:00:17.865610 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 03:00:18 crc kubenswrapper[4901]: I0309 03:00:18.901343 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 03:00:18 crc kubenswrapper[4901]: I0309 03:00:18.905354 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:18 crc kubenswrapper[4901]: I0309 03:00:18.907704 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mhqqf" Mar 09 03:00:18 crc kubenswrapper[4901]: I0309 03:00:18.908463 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 09 03:00:18 crc kubenswrapper[4901]: I0309 03:00:18.909316 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 09 03:00:18 crc kubenswrapper[4901]: I0309 03:00:18.910755 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 09 03:00:18 crc kubenswrapper[4901]: I0309 03:00:18.921363 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.012670 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.012710 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.012727 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.012749 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f0098aa8-4248-48ec-a254-368c395308b1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.012776 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.012791 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9fhm\" (UniqueName: \"kubernetes.io/projected/f0098aa8-4248-48ec-a254-368c395308b1-kube-api-access-q9fhm\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.012889 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.012948 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.096756 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.097757 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.102246 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.102456 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.102517 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qrtn2" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.107278 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.124051 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.124087 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9fhm\" (UniqueName: \"kubernetes.io/projected/f0098aa8-4248-48ec-a254-368c395308b1-kube-api-access-q9fhm\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: 
I0309 03:00:19.124108 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.124130 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.124260 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.124282 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.124304 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.124334 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f0098aa8-4248-48ec-a254-368c395308b1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.125014 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.125516 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.125640 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.126073 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.126745 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/f0098aa8-4248-48ec-a254-368c395308b1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.129849 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.139845 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9fhm\" (UniqueName: \"kubernetes.io/projected/f0098aa8-4248-48ec-a254-368c395308b1-kube-api-access-q9fhm\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.154709 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.172674 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.225924 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-memcached-tls-certs\") pod 
\"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.225981 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-kolla-config\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.226020 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.226052 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-config-data\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.226174 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zzp\" (UniqueName: \"kubernetes.io/projected/2b3e03cd-75ae-46dc-aee4-b778929cf535-kube-api-access-n5zzp\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.231461 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.327638 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.327683 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-kolla-config\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.327716 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.327734 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-config-data\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.327750 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zzp\" (UniqueName: \"kubernetes.io/projected/2b3e03cd-75ae-46dc-aee4-b778929cf535-kube-api-access-n5zzp\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.328534 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-kolla-config\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.328590 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-config-data\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.330875 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.331575 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.350742 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zzp\" (UniqueName: \"kubernetes.io/projected/2b3e03cd-75ae-46dc-aee4-b778929cf535-kube-api-access-n5zzp\") pod \"memcached-0\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " pod="openstack/memcached-0" Mar 09 03:00:19 crc kubenswrapper[4901]: I0309 03:00:19.505661 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 09 03:00:21 crc kubenswrapper[4901]: I0309 03:00:21.415647 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:00:21 crc kubenswrapper[4901]: I0309 03:00:21.418519 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 03:00:21 crc kubenswrapper[4901]: I0309 03:00:21.423655 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xml24" Mar 09 03:00:21 crc kubenswrapper[4901]: I0309 03:00:21.430450 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:00:21 crc kubenswrapper[4901]: I0309 03:00:21.461526 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhzl8\" (UniqueName: \"kubernetes.io/projected/993fdfea-9981-48c0-9b5b-c78eab5106a0-kube-api-access-rhzl8\") pod \"kube-state-metrics-0\" (UID: \"993fdfea-9981-48c0-9b5b-c78eab5106a0\") " pod="openstack/kube-state-metrics-0" Mar 09 03:00:21 crc kubenswrapper[4901]: I0309 03:00:21.563317 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhzl8\" (UniqueName: \"kubernetes.io/projected/993fdfea-9981-48c0-9b5b-c78eab5106a0-kube-api-access-rhzl8\") pod \"kube-state-metrics-0\" (UID: \"993fdfea-9981-48c0-9b5b-c78eab5106a0\") " pod="openstack/kube-state-metrics-0" Mar 09 03:00:21 crc kubenswrapper[4901]: I0309 03:00:21.594835 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhzl8\" (UniqueName: \"kubernetes.io/projected/993fdfea-9981-48c0-9b5b-c78eab5106a0-kube-api-access-rhzl8\") pod \"kube-state-metrics-0\" (UID: \"993fdfea-9981-48c0-9b5b-c78eab5106a0\") " pod="openstack/kube-state-metrics-0" Mar 09 03:00:21 crc kubenswrapper[4901]: I0309 03:00:21.741259 4901 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.701531 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hltph"] Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.705606 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.708141 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-bnn6v" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.711108 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.716475 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5rg5k"] Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.717702 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.719862 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.732104 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hltph"] Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.794963 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-etc-ovs\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795027 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcl27\" (UniqueName: \"kubernetes.io/projected/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-kube-api-access-fcl27\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795051 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-log\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795082 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-combined-ca-bundle\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc 
kubenswrapper[4901]: I0309 03:00:23.795103 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-scripts\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795125 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc54c941-19d2-42c1-b9f0-a3a58999bda5-scripts\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795151 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-log-ovn\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795170 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run-ovn\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795203 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795238 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-lib\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795272 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntvll\" (UniqueName: \"kubernetes.io/projected/dc54c941-19d2-42c1-b9f0-a3a58999bda5-kube-api-access-ntvll\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795290 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-run\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.795304 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-ovn-controller-tls-certs\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.811354 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5rg5k"] Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896099 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-lib\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " 
pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896137 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-run\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896156 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-ovn-controller-tls-certs\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896172 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntvll\" (UniqueName: \"kubernetes.io/projected/dc54c941-19d2-42c1-b9f0-a3a58999bda5-kube-api-access-ntvll\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896188 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-etc-ovs\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896214 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcl27\" (UniqueName: \"kubernetes.io/projected/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-kube-api-access-fcl27\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 
03:00:23.896236 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-log\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896287 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-combined-ca-bundle\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896316 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-scripts\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896341 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc54c941-19d2-42c1-b9f0-a3a58999bda5-scripts\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896371 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-log-ovn\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896392 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run-ovn\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896432 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.896973 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.897112 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-lib\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.897304 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-run\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.897447 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run-ovn\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 
crc kubenswrapper[4901]: I0309 03:00:23.897488 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-etc-ovs\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.897662 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-log-ovn\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.897822 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-log\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.899230 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc54c941-19d2-42c1-b9f0-a3a58999bda5-scripts\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.899244 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-scripts\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.902572 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-combined-ca-bundle\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.912650 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-ovn-controller-tls-certs\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.929650 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntvll\" (UniqueName: \"kubernetes.io/projected/dc54c941-19d2-42c1-b9f0-a3a58999bda5-kube-api-access-ntvll\") pod \"ovn-controller-5rg5k\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:23 crc kubenswrapper[4901]: I0309 03:00:23.932687 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcl27\" (UniqueName: \"kubernetes.io/projected/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-kube-api-access-fcl27\") pod \"ovn-controller-ovs-hltph\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:24 crc kubenswrapper[4901]: I0309 03:00:24.033712 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:24 crc kubenswrapper[4901]: I0309 03:00:24.043964 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.619907 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.621694 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.626293 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8ptvl" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.626296 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.626364 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.626320 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.626610 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.635706 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.648862 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-config\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.648925 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.649019 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.649082 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.649120 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpl8z\" (UniqueName: \"kubernetes.io/projected/c86c22f2-896c-4c29-95c7-024aea61dcd2-kube-api-access-tpl8z\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.649159 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.649191 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.649253 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.751132 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-config\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.751186 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.751268 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.751304 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.751342 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpl8z\" (UniqueName: \"kubernetes.io/projected/c86c22f2-896c-4c29-95c7-024aea61dcd2-kube-api-access-tpl8z\") pod \"ovsdbserver-nb-0\" (UID: 
\"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.751394 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.751435 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.751455 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.751760 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.752274 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.752596 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-config\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.752890 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.756444 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.756891 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.761484 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.768518 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpl8z\" (UniqueName: \"kubernetes.io/projected/c86c22f2-896c-4c29-95c7-024aea61dcd2-kube-api-access-tpl8z\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " 
pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.776495 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:26 crc kubenswrapper[4901]: I0309 03:00:26.941985 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.606906 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.608984 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.613852 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-j5q8w" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.621856 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.625026 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.625336 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.638373 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.686750 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-scripts\") pod \"ovsdbserver-sb-0\" 
(UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.686906 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.686988 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxs9l\" (UniqueName: \"kubernetes.io/projected/167ad9cc-678d-499b-9be0-2e74112f84c9-kube-api-access-xxs9l\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.687052 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.687110 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.687166 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " 
pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.687209 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.687303 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-config\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.788150 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.788694 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.788943 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-config\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.789120 4901 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.789188 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.789345 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.789463 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxs9l\" (UniqueName: \"kubernetes.io/projected/167ad9cc-678d-499b-9be0-2e74112f84c9-kube-api-access-xxs9l\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.789630 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.789755 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.789873 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.790040 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.790650 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.795773 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.795833 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.804531 4901 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.809479 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxs9l\" (UniqueName: \"kubernetes.io/projected/167ad9cc-678d-499b-9be0-2e74112f84c9-kube-api-access-xxs9l\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.813519 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:28 crc kubenswrapper[4901]: I0309 03:00:28.952032 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:30 crc kubenswrapper[4901]: I0309 03:00:30.863202 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:00:30 crc kubenswrapper[4901]: I0309 03:00:30.863574 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:00:30 crc kubenswrapper[4901]: I0309 03:00:30.863618 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 03:00:30 crc kubenswrapper[4901]: I0309 03:00:30.864251 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de1ddcd67e5e6d7dcbea8ef2824b5106d2b931526ee8f8e98968dbc1152811b6"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 03:00:30 crc kubenswrapper[4901]: I0309 03:00:30.864303 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://de1ddcd67e5e6d7dcbea8ef2824b5106d2b931526ee8f8e98968dbc1152811b6" gracePeriod=600 Mar 09 03:00:31 crc kubenswrapper[4901]: E0309 03:00:31.361475 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:9e7397d61095b02a8c1deb24bca874bc0032aa18019f12d53e0eda8998b85447" Mar 09 03:00:31 crc kubenswrapper[4901]: E0309 03:00:31.361688 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:9e7397d61095b02a8c1deb24bca874bc0032aa18019f12d53e0eda8998b85447,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-95n4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(46c7df0b-fc0a-4fd9-b097-72da03442510): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 03:00:31 crc 
kubenswrapper[4901]: E0309 03:00:31.362954 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="46c7df0b-fc0a-4fd9-b097-72da03442510" Mar 09 03:00:31 crc kubenswrapper[4901]: I0309 03:00:31.971500 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="de1ddcd67e5e6d7dcbea8ef2824b5106d2b931526ee8f8e98968dbc1152811b6" exitCode=0 Mar 09 03:00:31 crc kubenswrapper[4901]: I0309 03:00:31.972673 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"de1ddcd67e5e6d7dcbea8ef2824b5106d2b931526ee8f8e98968dbc1152811b6"} Mar 09 03:00:31 crc kubenswrapper[4901]: I0309 03:00:31.972708 4901 scope.go:117] "RemoveContainer" containerID="37ec3e94088a17553e2b069ce6fa01c84825c1f38b75b23f862711155501cfa6" Mar 09 03:00:31 crc kubenswrapper[4901]: E0309 03:00:31.977632 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:9e7397d61095b02a8c1deb24bca874bc0032aa18019f12d53e0eda8998b85447\\\"\"" pod="openstack/rabbitmq-server-0" podUID="46c7df0b-fc0a-4fd9-b097-72da03442510" Mar 09 03:00:35 crc kubenswrapper[4901]: I0309 03:00:35.786517 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 03:00:36 crc kubenswrapper[4901]: I0309 03:00:36.075712 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 03:00:36 crc kubenswrapper[4901]: I0309 03:00:36.083266 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] 
Mar 09 03:00:36 crc kubenswrapper[4901]: W0309 03:00:36.497353 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0098aa8_4248_48ec_a254_368c395308b1.slice/crio-1de970bcceff311fa19a500f73a8d13616b2bad13039b206aea33903c1d1417a WatchSource:0}: Error finding container 1de970bcceff311fa19a500f73a8d13616b2bad13039b206aea33903c1d1417a: Status 404 returned error can't find the container with id 1de970bcceff311fa19a500f73a8d13616b2bad13039b206aea33903c1d1417a Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.526069 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.526293 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bgnnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c47bcb9f9-j2tdt_openstack(c09579bf-bfa6-4f4a-b7c3-4be8a914270e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.528259 4901 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" podUID="c09579bf-bfa6-4f4a-b7c3-4be8a914270e" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.545502 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.545657 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5t6fs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-vcd9z_openstack(0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.546949 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" podUID="0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.604405 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.604580 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g8xdz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil
,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-79f9fc56ff-fc488_openstack(375fef20-b677-465b-bc26-cb609f6babae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.605989 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" podUID="375fef20-b677-465b-bc26-cb609f6babae" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.657404 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.657900 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zcd8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-4qr9p_openstack(7a9998e9-fba6-4227-9fae-a916d6143d7d): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Mar 09 03:00:36 crc kubenswrapper[4901]: E0309 03:00:36.659637 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" podUID="7a9998e9-fba6-4227-9fae-a916d6143d7d" Mar 09 03:00:36 crc kubenswrapper[4901]: I0309 03:00:36.740945 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:00:36 crc kubenswrapper[4901]: W0309 03:00:36.812708 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc54c941_19d2_42c1_b9f0_a3a58999bda5.slice/crio-1fa4549ee30fd478c1cc8063cbb815d38c1347d5e0c4dc36125f708c0a58f4ee WatchSource:0}: Error finding container 1fa4549ee30fd478c1cc8063cbb815d38c1347d5e0c4dc36125f708c0a58f4ee: Status 404 returned error can't find the container with id 1fa4549ee30fd478c1cc8063cbb815d38c1347d5e0c4dc36125f708c0a58f4ee Mar 09 03:00:36 crc kubenswrapper[4901]: I0309 03:00:36.815930 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5rg5k"] Mar 09 03:00:36 crc kubenswrapper[4901]: I0309 03:00:36.962659 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hltph"] Mar 09 03:00:36 crc kubenswrapper[4901]: W0309 03:00:36.965201 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b3c3ee5_e413_4c0f_bf6b_aba2084f4e89.slice/crio-b5c023718819a580beade659ae310802783879f3abc6a9a4e91c499e588bafe3 WatchSource:0}: Error finding container b5c023718819a580beade659ae310802783879f3abc6a9a4e91c499e588bafe3: Status 404 returned error can't find the container with id b5c023718819a580beade659ae310802783879f3abc6a9a4e91c499e588bafe3 Mar 09 03:00:37 crc 
kubenswrapper[4901]: I0309 03:00:37.018973 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"993fdfea-9981-48c0-9b5b-c78eab5106a0","Type":"ContainerStarted","Data":"8cdb357b3a0e049df907d97e8d5470b2840561b835a914a24933289af5f80c99"} Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.020275 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9df0684a-2816-4af7-97cf-00e31c542eef","Type":"ContainerStarted","Data":"45858a7d0c4eba77159193d5e3d5965a27c862de73a9ec0764595d4ab5a7ccbc"} Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.022834 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"9277963b36f2cb3e2457299d51b88e2cddec56d32cd2a3c3337a07a6a046785b"} Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.027161 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hltph" event={"ID":"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89","Type":"ContainerStarted","Data":"b5c023718819a580beade659ae310802783879f3abc6a9a4e91c499e588bafe3"} Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.028545 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f0098aa8-4248-48ec-a254-368c395308b1","Type":"ContainerStarted","Data":"1de970bcceff311fa19a500f73a8d13616b2bad13039b206aea33903c1d1417a"} Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.030731 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5rg5k" event={"ID":"dc54c941-19d2-42c1-b9f0-a3a58999bda5","Type":"ContainerStarted","Data":"1fa4549ee30fd478c1cc8063cbb815d38c1347d5e0c4dc36125f708c0a58f4ee"} Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.032569 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/memcached-0" event={"ID":"2b3e03cd-75ae-46dc-aee4-b778929cf535","Type":"ContainerStarted","Data":"c848cdd78c97d85450b65da73f826ef1b05faf1160a2df5372998fbcfd87cff9"} Mar 09 03:00:37 crc kubenswrapper[4901]: E0309 03:00:37.037378 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" podUID="c09579bf-bfa6-4f4a-b7c3-4be8a914270e" Mar 09 03:00:37 crc kubenswrapper[4901]: E0309 03:00:37.037583 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" podUID="375fef20-b677-465b-bc26-cb609f6babae" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.246776 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 03:00:37 crc kubenswrapper[4901]: W0309 03:00:37.315948 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod167ad9cc_678d_499b_9be0_2e74112f84c9.slice/crio-46f0b2c02c4740aa4f7c827024b1083c604e073028efe8be643c2bdaa5a311aa WatchSource:0}: Error finding container 46f0b2c02c4740aa4f7c827024b1083c604e073028efe8be643c2bdaa5a311aa: Status 404 returned error can't find the container with id 46f0b2c02c4740aa4f7c827024b1083c604e073028efe8be643c2bdaa5a311aa Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.368927 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.564591 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-config\") pod \"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a\" (UID: \"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a\") " Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.565245 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-config" (OuterVolumeSpecName: "config") pod "0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a" (UID: "0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.566018 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t6fs\" (UniqueName: \"kubernetes.io/projected/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-kube-api-access-5t6fs\") pod \"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a\" (UID: \"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a\") " Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.566973 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.607753 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.610405 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-kube-api-access-5t6fs" (OuterVolumeSpecName: "kube-api-access-5t6fs") pod "0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a" (UID: "0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a"). 
InnerVolumeSpecName "kube-api-access-5t6fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.667665 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t6fs\" (UniqueName: \"kubernetes.io/projected/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a-kube-api-access-5t6fs\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:37 crc kubenswrapper[4901]: W0309 03:00:37.704662 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc86c22f2_896c_4c29_95c7_024aea61dcd2.slice/crio-66faf20c023bbaffbd283fb5360df2c2927680324af54043b31e59e6a5b95696 WatchSource:0}: Error finding container 66faf20c023bbaffbd283fb5360df2c2927680324af54043b31e59e6a5b95696: Status 404 returned error can't find the container with id 66faf20c023bbaffbd283fb5360df2c2927680324af54043b31e59e6a5b95696 Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.753831 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.869754 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcd8w\" (UniqueName: \"kubernetes.io/projected/7a9998e9-fba6-4227-9fae-a916d6143d7d-kube-api-access-zcd8w\") pod \"7a9998e9-fba6-4227-9fae-a916d6143d7d\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.869900 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-dns-svc\") pod \"7a9998e9-fba6-4227-9fae-a916d6143d7d\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.869927 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-config\") pod \"7a9998e9-fba6-4227-9fae-a916d6143d7d\" (UID: \"7a9998e9-fba6-4227-9fae-a916d6143d7d\") " Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.870476 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a9998e9-fba6-4227-9fae-a916d6143d7d" (UID: "7a9998e9-fba6-4227-9fae-a916d6143d7d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.870653 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-config" (OuterVolumeSpecName: "config") pod "7a9998e9-fba6-4227-9fae-a916d6143d7d" (UID: "7a9998e9-fba6-4227-9fae-a916d6143d7d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.872991 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9998e9-fba6-4227-9fae-a916d6143d7d-kube-api-access-zcd8w" (OuterVolumeSpecName: "kube-api-access-zcd8w") pod "7a9998e9-fba6-4227-9fae-a916d6143d7d" (UID: "7a9998e9-fba6-4227-9fae-a916d6143d7d"). InnerVolumeSpecName "kube-api-access-zcd8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.972004 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcd8w\" (UniqueName: \"kubernetes.io/projected/7a9998e9-fba6-4227-9fae-a916d6143d7d-kube-api-access-zcd8w\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.972034 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:37 crc kubenswrapper[4901]: I0309 03:00:37.972044 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a9998e9-fba6-4227-9fae-a916d6143d7d-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:38 crc kubenswrapper[4901]: I0309 03:00:38.042447 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" event={"ID":"0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a","Type":"ContainerDied","Data":"c0faff36f2b82479038d2ec0b7a707c9b803ccf7555cb143101c8a3e962c88c3"} Mar 09 03:00:38 crc kubenswrapper[4901]: I0309 03:00:38.042438 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-vcd9z" Mar 09 03:00:38 crc kubenswrapper[4901]: I0309 03:00:38.044734 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c86c22f2-896c-4c29-95c7-024aea61dcd2","Type":"ContainerStarted","Data":"66faf20c023bbaffbd283fb5360df2c2927680324af54043b31e59e6a5b95696"} Mar 09 03:00:38 crc kubenswrapper[4901]: I0309 03:00:38.047674 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98538e55-cb87-49e2-9fd5-fff06d7edfdd","Type":"ContainerStarted","Data":"d573b837ddf089152e6738d97df2ec1aa5c6f25f6f2ae8c229ee9079ec71fbad"} Mar 09 03:00:38 crc kubenswrapper[4901]: I0309 03:00:38.049204 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" Mar 09 03:00:38 crc kubenswrapper[4901]: I0309 03:00:38.049248 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-4qr9p" event={"ID":"7a9998e9-fba6-4227-9fae-a916d6143d7d","Type":"ContainerDied","Data":"4e93826f3d55602b22ae95f5a29e8514a3571c0218702419830155718f040adc"} Mar 09 03:00:38 crc kubenswrapper[4901]: I0309 03:00:38.050255 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"167ad9cc-678d-499b-9be0-2e74112f84c9","Type":"ContainerStarted","Data":"46f0b2c02c4740aa4f7c827024b1083c604e073028efe8be643c2bdaa5a311aa"} Mar 09 03:00:38 crc kubenswrapper[4901]: I0309 03:00:38.127363 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-vcd9z"] Mar 09 03:00:38 crc kubenswrapper[4901]: I0309 03:00:38.127400 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-vcd9z"] Mar 09 03:00:38 crc kubenswrapper[4901]: I0309 03:00:38.149060 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4qr9p"] Mar 09 03:00:38 crc 
kubenswrapper[4901]: I0309 03:00:38.159626 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4qr9p"] Mar 09 03:00:40 crc kubenswrapper[4901]: I0309 03:00:40.116088 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a" path="/var/lib/kubelet/pods/0bdf9323-3b7a-4dea-aa2e-63dd95f0cb5a/volumes" Mar 09 03:00:40 crc kubenswrapper[4901]: I0309 03:00:40.117333 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9998e9-fba6-4227-9fae-a916d6143d7d" path="/var/lib/kubelet/pods/7a9998e9-fba6-4227-9fae-a916d6143d7d/volumes" Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.132078 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9df0684a-2816-4af7-97cf-00e31c542eef","Type":"ContainerStarted","Data":"0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa"} Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.133638 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c86c22f2-896c-4c29-95c7-024aea61dcd2","Type":"ContainerStarted","Data":"f60874d330498787b4f53dbd548f5bbb7d7609369bb61c9f558d83dea563dbb7"} Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.136359 4901 generic.go:334] "Generic (PLEG): container finished" podID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerID="858a5a877443194b8865cd485f5efb4064ab5bc8f500d6a40ba7b9d488f969ad" exitCode=0 Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.136431 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hltph" event={"ID":"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89","Type":"ContainerDied","Data":"858a5a877443194b8865cd485f5efb4064ab5bc8f500d6a40ba7b9d488f969ad"} Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.140889 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"f0098aa8-4248-48ec-a254-368c395308b1","Type":"ContainerStarted","Data":"3a05919e373ebb6f88b2fb0ed9c30b2b394efeac74eb31a4d3a3029fe54bc70d"} Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.143383 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"167ad9cc-678d-499b-9be0-2e74112f84c9","Type":"ContainerStarted","Data":"58f28fe8133335254744ffb487e2889617fa95aef6fad8082e0e5543fe0012a2"} Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.145578 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5rg5k" event={"ID":"dc54c941-19d2-42c1-b9f0-a3a58999bda5","Type":"ContainerStarted","Data":"9006efa47acc80f02568c7e41f3501e04cd4ba5afcd137f8e6891cbea2267262"} Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.145917 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-5rg5k" Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.147726 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2b3e03cd-75ae-46dc-aee4-b778929cf535","Type":"ContainerStarted","Data":"854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5"} Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.148157 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.160777 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"993fdfea-9981-48c0-9b5b-c78eab5106a0","Type":"ContainerStarted","Data":"bbda781da1c908b27a4862b6c710a520d1d856aa2df5f0b8d9a4ae8fa51858c2"} Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.160985 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.228138 4901 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/memcached-0" podStartSLOduration=22.417474735 podStartE2EDuration="26.228119805s" podCreationTimestamp="2026-03-09 03:00:19 +0000 UTC" firstStartedPulling="2026-03-09 03:00:36.555862761 +0000 UTC m=+1161.145526513" lastFinishedPulling="2026-03-09 03:00:40.366507851 +0000 UTC m=+1164.956171583" observedRunningTime="2026-03-09 03:00:45.222723929 +0000 UTC m=+1169.812387661" watchObservedRunningTime="2026-03-09 03:00:45.228119805 +0000 UTC m=+1169.817783537" Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.247965 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5rg5k" podStartSLOduration=14.846501567 podStartE2EDuration="22.247944106s" podCreationTimestamp="2026-03-09 03:00:23 +0000 UTC" firstStartedPulling="2026-03-09 03:00:36.815707753 +0000 UTC m=+1161.405371485" lastFinishedPulling="2026-03-09 03:00:44.217150282 +0000 UTC m=+1168.806814024" observedRunningTime="2026-03-09 03:00:45.238854876 +0000 UTC m=+1169.828518608" watchObservedRunningTime="2026-03-09 03:00:45.247944106 +0000 UTC m=+1169.837607838" Mar 09 03:00:45 crc kubenswrapper[4901]: I0309 03:00:45.264322 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.698630623 podStartE2EDuration="24.264299088s" podCreationTimestamp="2026-03-09 03:00:21 +0000 UTC" firstStartedPulling="2026-03-09 03:00:36.752546047 +0000 UTC m=+1161.342209779" lastFinishedPulling="2026-03-09 03:00:44.318214492 +0000 UTC m=+1168.907878244" observedRunningTime="2026-03-09 03:00:45.262692108 +0000 UTC m=+1169.852355840" watchObservedRunningTime="2026-03-09 03:00:45.264299088 +0000 UTC m=+1169.853962820" Mar 09 03:00:46 crc kubenswrapper[4901]: I0309 03:00:46.177095 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hltph" 
event={"ID":"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89","Type":"ContainerStarted","Data":"b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d"} Mar 09 03:00:46 crc kubenswrapper[4901]: I0309 03:00:46.177736 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hltph" event={"ID":"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89","Type":"ContainerStarted","Data":"0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f"} Mar 09 03:00:46 crc kubenswrapper[4901]: I0309 03:00:46.205870 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hltph" podStartSLOduration=16.726790127 podStartE2EDuration="23.2058563s" podCreationTimestamp="2026-03-09 03:00:23 +0000 UTC" firstStartedPulling="2026-03-09 03:00:36.967845032 +0000 UTC m=+1161.557508764" lastFinishedPulling="2026-03-09 03:00:43.446911205 +0000 UTC m=+1168.036574937" observedRunningTime="2026-03-09 03:00:46.205028049 +0000 UTC m=+1170.794691791" watchObservedRunningTime="2026-03-09 03:00:46.2058563 +0000 UTC m=+1170.795520032" Mar 09 03:00:46 crc kubenswrapper[4901]: I0309 03:00:46.952059 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nmx84"] Mar 09 03:00:46 crc kubenswrapper[4901]: I0309 03:00:46.953027 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:46 crc kubenswrapper[4901]: I0309 03:00:46.958245 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 09 03:00:46 crc kubenswrapper[4901]: I0309 03:00:46.984073 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nmx84"] Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.048091 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-combined-ca-bundle\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.048184 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovn-rundir\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.048208 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.048241 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovs-rundir\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") 
" pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.052701 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065dfe75-7489-4b15-8a4d-4adf13393aea-config\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.052823 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh9p6\" (UniqueName: \"kubernetes.io/projected/065dfe75-7489-4b15-8a4d-4adf13393aea-kube-api-access-jh9p6\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.157557 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovn-rundir\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.157606 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.157630 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovs-rundir\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " 
pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.157649 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065dfe75-7489-4b15-8a4d-4adf13393aea-config\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.157676 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh9p6\" (UniqueName: \"kubernetes.io/projected/065dfe75-7489-4b15-8a4d-4adf13393aea-kube-api-access-jh9p6\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.157730 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-combined-ca-bundle\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.160464 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovs-rundir\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.160555 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovn-rundir\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc 
kubenswrapper[4901]: I0309 03:00:47.162292 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065dfe75-7489-4b15-8a4d-4adf13393aea-config\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.170839 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.173100 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-combined-ca-bundle\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.176481 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh9p6\" (UniqueName: \"kubernetes.io/projected/065dfe75-7489-4b15-8a4d-4adf13393aea-kube-api-access-jh9p6\") pod \"ovn-controller-metrics-nmx84\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.179164 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-fc488"] Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.190377 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.190521 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.204658 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-rtwnv"] Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.206283 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.210129 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.223635 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-rtwnv"] Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.273631 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.351915 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-j2tdt"] Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.361034 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-dns-svc\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.361086 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dtkd\" (UniqueName: \"kubernetes.io/projected/c431ae87-ef51-42f6-8ace-cd59ac685e50-kube-api-access-7dtkd\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.361198 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-config\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.361390 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-ovsdbserver-nb\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.390198 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-h5cnt"] Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.391615 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.408588 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.431044 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-h5cnt"] Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.465982 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-ovsdbserver-nb\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.466042 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.466076 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-dns-svc\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.466113 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dtkd\" (UniqueName: \"kubernetes.io/projected/c431ae87-ef51-42f6-8ace-cd59ac685e50-kube-api-access-7dtkd\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.466216 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.466264 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz5n2\" (UniqueName: \"kubernetes.io/projected/171527dd-3e70-4f4e-8b52-9302b6fd6238-kube-api-access-xz5n2\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.466288 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-config\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.466320 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-dns-svc\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.466362 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-config\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.467173 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-config\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.467249 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-ovsdbserver-nb\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.467817 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-dns-svc\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" 
(UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.500402 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dtkd\" (UniqueName: \"kubernetes.io/projected/c431ae87-ef51-42f6-8ace-cd59ac685e50-kube-api-access-7dtkd\") pod \"dnsmasq-dns-b47ddbdf5-rtwnv\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.551148 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.567580 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.567676 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.567700 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz5n2\" (UniqueName: \"kubernetes.io/projected/171527dd-3e70-4f4e-8b52-9302b6fd6238-kube-api-access-xz5n2\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.567720 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-config\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.567745 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-dns-svc\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.568504 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-dns-svc\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.569006 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.569997 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.571212 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-config\") pod 
\"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.634957 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz5n2\" (UniqueName: \"kubernetes.io/projected/171527dd-3e70-4f4e-8b52-9302b6fd6238-kube-api-access-xz5n2\") pod \"dnsmasq-dns-659ddb758c-h5cnt\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") " pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:47 crc kubenswrapper[4901]: I0309 03:00:47.733491 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.197679 4901 generic.go:334] "Generic (PLEG): container finished" podID="9df0684a-2816-4af7-97cf-00e31c542eef" containerID="0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa" exitCode=0 Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.197751 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9df0684a-2816-4af7-97cf-00e31c542eef","Type":"ContainerDied","Data":"0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa"} Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.201717 4901 generic.go:334] "Generic (PLEG): container finished" podID="f0098aa8-4248-48ec-a254-368c395308b1" containerID="3a05919e373ebb6f88b2fb0ed9c30b2b394efeac74eb31a4d3a3029fe54bc70d" exitCode=0 Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.201805 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f0098aa8-4248-48ec-a254-368c395308b1","Type":"ContainerDied","Data":"3a05919e373ebb6f88b2fb0ed9c30b2b394efeac74eb31a4d3a3029fe54bc70d"} Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.585149 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.590718 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.685652 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgnnz\" (UniqueName: \"kubernetes.io/projected/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-kube-api-access-bgnnz\") pod \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.685689 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8xdz\" (UniqueName: \"kubernetes.io/projected/375fef20-b677-465b-bc26-cb609f6babae-kube-api-access-g8xdz\") pod \"375fef20-b677-465b-bc26-cb609f6babae\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.685759 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-config\") pod \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\" (UID: \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.685813 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-config\") pod \"375fef20-b677-465b-bc26-cb609f6babae\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.685848 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-dns-svc\") pod \"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\" (UID: 
\"c09579bf-bfa6-4f4a-b7c3-4be8a914270e\") " Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.685868 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-dns-svc\") pod \"375fef20-b677-465b-bc26-cb609f6babae\" (UID: \"375fef20-b677-465b-bc26-cb609f6babae\") " Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.686889 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-config" (OuterVolumeSpecName: "config") pod "c09579bf-bfa6-4f4a-b7c3-4be8a914270e" (UID: "c09579bf-bfa6-4f4a-b7c3-4be8a914270e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.687248 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-config" (OuterVolumeSpecName: "config") pod "375fef20-b677-465b-bc26-cb609f6babae" (UID: "375fef20-b677-465b-bc26-cb609f6babae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.687370 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "375fef20-b677-465b-bc26-cb609f6babae" (UID: "375fef20-b677-465b-bc26-cb609f6babae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.687454 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c09579bf-bfa6-4f4a-b7c3-4be8a914270e" (UID: "c09579bf-bfa6-4f4a-b7c3-4be8a914270e"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.689328 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-kube-api-access-bgnnz" (OuterVolumeSpecName: "kube-api-access-bgnnz") pod "c09579bf-bfa6-4f4a-b7c3-4be8a914270e" (UID: "c09579bf-bfa6-4f4a-b7c3-4be8a914270e"). InnerVolumeSpecName "kube-api-access-bgnnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.690798 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375fef20-b677-465b-bc26-cb609f6babae-kube-api-access-g8xdz" (OuterVolumeSpecName: "kube-api-access-g8xdz") pod "375fef20-b677-465b-bc26-cb609f6babae" (UID: "375fef20-b677-465b-bc26-cb609f6babae"). InnerVolumeSpecName "kube-api-access-g8xdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.787421 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.787500 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgnnz\" (UniqueName: \"kubernetes.io/projected/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-kube-api-access-bgnnz\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.787552 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8xdz\" (UniqueName: \"kubernetes.io/projected/375fef20-b677-465b-bc26-cb609f6babae-kube-api-access-g8xdz\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.787573 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.787589 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375fef20-b677-465b-bc26-cb609f6babae-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:48 crc kubenswrapper[4901]: I0309 03:00:48.787605 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c09579bf-bfa6-4f4a-b7c3-4be8a914270e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.211930 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" event={"ID":"c09579bf-bfa6-4f4a-b7c3-4be8a914270e","Type":"ContainerDied","Data":"0c0bcb4e430b06af5a8ad6ac3f00c53d885d8630c6fbfebe108a30883eb120d8"} Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.212045 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-j2tdt" Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.219723 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" event={"ID":"375fef20-b677-465b-bc26-cb609f6babae","Type":"ContainerDied","Data":"95b56f47a0b9ac70edf505f6646db379f6a4fb3662601fa0b96e0ace3ab139c1"} Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.219785 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-fc488" Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.284352 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-j2tdt"] Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.294581 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-j2tdt"] Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.316166 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-fc488"] Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.321238 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-fc488"] Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.521587 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.871049 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nmx84"] Mar 09 03:00:49 crc kubenswrapper[4901]: W0309 03:00:49.876001 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod065dfe75_7489_4b15_8a4d_4adf13393aea.slice/crio-e9fb4b2a34aaff405e7d422b2c31e892d446b9f1b8c75813f2f8499868a73aed WatchSource:0}: Error finding container e9fb4b2a34aaff405e7d422b2c31e892d446b9f1b8c75813f2f8499868a73aed: Status 404 returned error can't find the container with id e9fb4b2a34aaff405e7d422b2c31e892d446b9f1b8c75813f2f8499868a73aed Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.964990 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-h5cnt"] Mar 09 03:00:49 crc kubenswrapper[4901]: I0309 03:00:49.974888 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-rtwnv"] Mar 09 03:00:50 crc kubenswrapper[4901]: W0309 03:00:50.116304 4901 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc431ae87_ef51_42f6_8ace_cd59ac685e50.slice/crio-d741d7455bb9a1b175ac7cbd4a02faff701c2968e20c1e248b918cc9d750d1e7 WatchSource:0}: Error finding container d741d7455bb9a1b175ac7cbd4a02faff701c2968e20c1e248b918cc9d750d1e7: Status 404 returned error can't find the container with id d741d7455bb9a1b175ac7cbd4a02faff701c2968e20c1e248b918cc9d750d1e7 Mar 09 03:00:50 crc kubenswrapper[4901]: W0309 03:00:50.116973 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod171527dd_3e70_4f4e_8b52_9302b6fd6238.slice/crio-42c575fa96b6650caf9ba60b5cc69c7c9bec4dd715d2462e05d19f2f671ca948 WatchSource:0}: Error finding container 42c575fa96b6650caf9ba60b5cc69c7c9bec4dd715d2462e05d19f2f671ca948: Status 404 returned error can't find the container with id 42c575fa96b6650caf9ba60b5cc69c7c9bec4dd715d2462e05d19f2f671ca948 Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.120537 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375fef20-b677-465b-bc26-cb609f6babae" path="/var/lib/kubelet/pods/375fef20-b677-465b-bc26-cb609f6babae/volumes" Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.121400 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09579bf-bfa6-4f4a-b7c3-4be8a914270e" path="/var/lib/kubelet/pods/c09579bf-bfa6-4f4a-b7c3-4be8a914270e/volumes" Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.226989 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"167ad9cc-678d-499b-9be0-2e74112f84c9","Type":"ContainerStarted","Data":"6f9b2c7bb5b12105cf493783a9ea56c27bfdd210b2404ad5c2701248c69906c1"} Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.227988 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" 
event={"ID":"171527dd-3e70-4f4e-8b52-9302b6fd6238","Type":"ContainerStarted","Data":"42c575fa96b6650caf9ba60b5cc69c7c9bec4dd715d2462e05d19f2f671ca948"} Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.229427 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9df0684a-2816-4af7-97cf-00e31c542eef","Type":"ContainerStarted","Data":"a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0"} Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.230680 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f0098aa8-4248-48ec-a254-368c395308b1","Type":"ContainerStarted","Data":"b4c164d97e9bee042efac1c9e73f97d746429bbdb1ecf72f43fc47c10cd2ec24"} Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.231994 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c86c22f2-896c-4c29-95c7-024aea61dcd2","Type":"ContainerStarted","Data":"85a90efd98e045964af92120b771705db922d73189fbe96e8a40862a41bec02c"} Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.232675 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nmx84" event={"ID":"065dfe75-7489-4b15-8a4d-4adf13393aea","Type":"ContainerStarted","Data":"e9fb4b2a34aaff405e7d422b2c31e892d446b9f1b8c75813f2f8499868a73aed"} Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.233352 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" event={"ID":"c431ae87-ef51-42f6-8ace-cd59ac685e50","Type":"ContainerStarted","Data":"d741d7455bb9a1b175ac7cbd4a02faff701c2968e20c1e248b918cc9d750d1e7"} Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.287168 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.614832278 podStartE2EDuration="25.287148427s" podCreationTimestamp="2026-03-09 03:00:25 +0000 UTC" 
firstStartedPulling="2026-03-09 03:00:37.708995248 +0000 UTC m=+1162.298658980" lastFinishedPulling="2026-03-09 03:00:49.381311397 +0000 UTC m=+1173.970975129" observedRunningTime="2026-03-09 03:00:50.280032078 +0000 UTC m=+1174.869695810" watchObservedRunningTime="2026-03-09 03:00:50.287148427 +0000 UTC m=+1174.876812159" Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.288946 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.222030676 podStartE2EDuration="23.288938912s" podCreationTimestamp="2026-03-09 03:00:27 +0000 UTC" firstStartedPulling="2026-03-09 03:00:37.318399492 +0000 UTC m=+1161.908063214" lastFinishedPulling="2026-03-09 03:00:49.385307718 +0000 UTC m=+1173.974971450" observedRunningTime="2026-03-09 03:00:50.253855347 +0000 UTC m=+1174.843519079" watchObservedRunningTime="2026-03-09 03:00:50.288938912 +0000 UTC m=+1174.878602644" Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.306872 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.616034643 podStartE2EDuration="34.306854104s" podCreationTimestamp="2026-03-09 03:00:16 +0000 UTC" firstStartedPulling="2026-03-09 03:00:36.513243091 +0000 UTC m=+1161.102906833" lastFinishedPulling="2026-03-09 03:00:44.204062542 +0000 UTC m=+1168.793726294" observedRunningTime="2026-03-09 03:00:50.304341651 +0000 UTC m=+1174.894005393" watchObservedRunningTime="2026-03-09 03:00:50.306854104 +0000 UTC m=+1174.896517826" Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.335832 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.089348996 podStartE2EDuration="33.335814045s" podCreationTimestamp="2026-03-09 03:00:17 +0000 UTC" firstStartedPulling="2026-03-09 03:00:36.513516447 +0000 UTC m=+1161.103180219" lastFinishedPulling="2026-03-09 03:00:43.759981526 +0000 UTC 
m=+1168.349645268" observedRunningTime="2026-03-09 03:00:50.332637425 +0000 UTC m=+1174.922301157" watchObservedRunningTime="2026-03-09 03:00:50.335814045 +0000 UTC m=+1174.925477777" Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.942433 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:50 crc kubenswrapper[4901]: I0309 03:00:50.987417 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.248271 4901 generic.go:334] "Generic (PLEG): container finished" podID="c431ae87-ef51-42f6-8ace-cd59ac685e50" containerID="df9cd0199fcc69f609437b41bc77f246ff247244dbcd9149c203e85cb217921b" exitCode=0 Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.248350 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" event={"ID":"c431ae87-ef51-42f6-8ace-cd59ac685e50","Type":"ContainerDied","Data":"df9cd0199fcc69f609437b41bc77f246ff247244dbcd9149c203e85cb217921b"} Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.256633 4901 generic.go:334] "Generic (PLEG): container finished" podID="171527dd-3e70-4f4e-8b52-9302b6fd6238" containerID="5ec18fba2a1c1f77b64b6ed489e33eb3eaf7fc823ecee63156c904288d9accd5" exitCode=0 Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.256716 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" event={"ID":"171527dd-3e70-4f4e-8b52-9302b6fd6238","Type":"ContainerDied","Data":"5ec18fba2a1c1f77b64b6ed489e33eb3eaf7fc823ecee63156c904288d9accd5"} Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.259392 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46c7df0b-fc0a-4fd9-b097-72da03442510","Type":"ContainerStarted","Data":"16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7"} Mar 09 03:00:51 crc 
kubenswrapper[4901]: I0309 03:00:51.260998 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nmx84" event={"ID":"065dfe75-7489-4b15-8a4d-4adf13393aea","Type":"ContainerStarted","Data":"97ab13e50a94e0d652fdbd979cbe25cc835cb2286e4d09687a06d52e3b5f01f1"} Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.261342 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.307702 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.320815 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nmx84" podStartSLOduration=5.320793182 podStartE2EDuration="5.320793182s" podCreationTimestamp="2026-03-09 03:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:00:51.310660366 +0000 UTC m=+1175.900324108" watchObservedRunningTime="2026-03-09 03:00:51.320793182 +0000 UTC m=+1175.910456914" Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.757754 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.845527 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-h5cnt"] Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.881716 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58df884995-7dflj"] Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.884561 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:51 crc kubenswrapper[4901]: I0309 03:00:51.902112 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58df884995-7dflj"] Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.040177 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-config\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.040245 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkmpp\" (UniqueName: \"kubernetes.io/projected/eb2ebb18-ec3e-4597-a641-de94b57c923d-kube-api-access-nkmpp\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.040267 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.040301 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.040491 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-dns-svc\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.141787 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-dns-svc\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.141877 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-config\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.141900 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkmpp\" (UniqueName: \"kubernetes.io/projected/eb2ebb18-ec3e-4597-a641-de94b57c923d-kube-api-access-nkmpp\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.141916 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.141935 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.142565 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-dns-svc\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.143052 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.143295 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-config\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.143563 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.165996 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkmpp\" (UniqueName: 
\"kubernetes.io/projected/eb2ebb18-ec3e-4597-a641-de94b57c923d-kube-api-access-nkmpp\") pod \"dnsmasq-dns-58df884995-7dflj\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.210185 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.272527 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" event={"ID":"c431ae87-ef51-42f6-8ace-cd59ac685e50","Type":"ContainerStarted","Data":"6ac311bce46a5221b5a6a5158083609bd8be2a49eb80aba6263cd5efcc18529d"} Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.272617 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.274953 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" event={"ID":"171527dd-3e70-4f4e-8b52-9302b6fd6238","Type":"ContainerStarted","Data":"25377e9b21412a91545980379c30f3ccd01fe4876bc92c11e5fa4f0715c34cc7"} Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.276052 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.314801 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" podStartSLOduration=4.902869362 podStartE2EDuration="5.314778147s" podCreationTimestamp="2026-03-09 03:00:47 +0000 UTC" firstStartedPulling="2026-03-09 03:00:50.119438355 +0000 UTC m=+1174.709102127" lastFinishedPulling="2026-03-09 03:00:50.53134718 +0000 UTC m=+1175.121010912" observedRunningTime="2026-03-09 03:00:52.291797817 +0000 UTC m=+1176.881461559" watchObservedRunningTime="2026-03-09 03:00:52.314778147 +0000 UTC 
m=+1176.904441909" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.317817 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" podStartSLOduration=4.806209842 podStartE2EDuration="5.317803693s" podCreationTimestamp="2026-03-09 03:00:47 +0000 UTC" firstStartedPulling="2026-03-09 03:00:50.121387594 +0000 UTC m=+1174.711051326" lastFinishedPulling="2026-03-09 03:00:50.632981435 +0000 UTC m=+1175.222645177" observedRunningTime="2026-03-09 03:00:52.308943419 +0000 UTC m=+1176.898607161" watchObservedRunningTime="2026-03-09 03:00:52.317803693 +0000 UTC m=+1176.907467445" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.695348 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58df884995-7dflj"] Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.912705 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.920186 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.922338 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.922665 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.922822 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-shv88" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.922968 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.939051 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.953060 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:52 crc kubenswrapper[4901]: I0309 03:00:52.996832 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.061763 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-cache\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.062024 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rtff\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-kube-api-access-2rtff\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc 
kubenswrapper[4901]: I0309 03:00:53.062112 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.062142 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e038ec-a406-4f6b-9b8a-135c56be7514-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.062179 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.062238 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-lock\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.163590 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rtff\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-kube-api-access-2rtff\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.163649 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.163675 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e038ec-a406-4f6b-9b8a-135c56be7514-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.163704 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.163729 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-lock\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.163838 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-cache\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.164464 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-cache\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: E0309 03:00:53.164863 4901 
projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 03:00:53 crc kubenswrapper[4901]: E0309 03:00:53.164897 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.164945 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-lock\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: E0309 03:00:53.164961 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift podName:56e038ec-a406-4f6b-9b8a-135c56be7514 nodeName:}" failed. No retries permitted until 2026-03-09 03:00:53.664939562 +0000 UTC m=+1178.254603374 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift") pod "swift-storage-0" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514") : configmap "swift-ring-files" not found Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.165877 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.168802 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e038ec-a406-4f6b-9b8a-135c56be7514-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.203674 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rtff\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-kube-api-access-2rtff\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.205109 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.283340 4901 generic.go:334] "Generic (PLEG): container finished" podID="eb2ebb18-ec3e-4597-a641-de94b57c923d" containerID="65662cf1784cc36b38023a272dab489323d59c686112b747b011f6a531cd625c" exitCode=0 Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 
03:00:53.283857 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" podUID="171527dd-3e70-4f4e-8b52-9302b6fd6238" containerName="dnsmasq-dns" containerID="cri-o://25377e9b21412a91545980379c30f3ccd01fe4876bc92c11e5fa4f0715c34cc7" gracePeriod=10 Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.284697 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-7dflj" event={"ID":"eb2ebb18-ec3e-4597-a641-de94b57c923d","Type":"ContainerDied","Data":"65662cf1784cc36b38023a272dab489323d59c686112b747b011f6a531cd625c"} Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.284737 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-7dflj" event={"ID":"eb2ebb18-ec3e-4597-a641-de94b57c923d","Type":"ContainerStarted","Data":"49d8ae16315304ac35357ac8beff794405cca949dc50117ce9dbf2491502f20e"} Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.284756 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.343082 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.482986 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-5z8xc"] Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.484292 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.489615 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.489786 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.489906 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.502981 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5z8xc"] Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.538127 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5z8xc"] Mar 09 03:00:53 crc kubenswrapper[4901]: E0309 03:00:53.538803 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-rjhtm ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-rjhtm ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-5z8xc" podUID="990424a9-a455-4e13-a31b-88002f687d6a" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.547604 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wbqpx"] Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.549124 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570591 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-swiftconf\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570702 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjhtm\" (UniqueName: \"kubernetes.io/projected/990424a9-a455-4e13-a31b-88002f687d6a-kube-api-access-rjhtm\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570745 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-swiftconf\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570779 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-ring-data-devices\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570808 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/990424a9-a455-4e13-a31b-88002f687d6a-etc-swift\") pod \"swift-ring-rebalance-5z8xc\" (UID: 
\"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570858 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-scripts\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570881 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-dispersionconf\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570902 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-combined-ca-bundle\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570939 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-ring-data-devices\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570976 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19d5d146-9bfc-45cd-ae62-ffd05473b125-etc-swift\") pod \"swift-ring-rebalance-wbqpx\" 
(UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.570999 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-combined-ca-bundle\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.571032 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-scripts\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.571056 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-dispersionconf\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.571077 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfvrl\" (UniqueName: \"kubernetes.io/projected/19d5d146-9bfc-45cd-ae62-ffd05473b125-kube-api-access-bfvrl\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.586452 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wbqpx"] Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672592 4901 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-ring-data-devices\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672653 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/990424a9-a455-4e13-a31b-88002f687d6a-etc-swift\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672688 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672719 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-dispersionconf\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672740 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-scripts\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672765 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-combined-ca-bundle\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672798 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-ring-data-devices\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672832 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19d5d146-9bfc-45cd-ae62-ffd05473b125-etc-swift\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672854 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-combined-ca-bundle\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672881 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-scripts\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672902 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-dispersionconf\") pod 
\"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672964 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvrl\" (UniqueName: \"kubernetes.io/projected/19d5d146-9bfc-45cd-ae62-ffd05473b125-kube-api-access-bfvrl\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.672998 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-swiftconf\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.673060 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjhtm\" (UniqueName: \"kubernetes.io/projected/990424a9-a455-4e13-a31b-88002f687d6a-kube-api-access-rjhtm\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.673108 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-swiftconf\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.674677 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19d5d146-9bfc-45cd-ae62-ffd05473b125-etc-swift\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " 
pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.674684 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-ring-data-devices\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.676077 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-scripts\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.676092 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.676581 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-swiftconf\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.676836 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-ring-data-devices\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.677770 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-scripts\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") 
" pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.677917 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/990424a9-a455-4e13-a31b-88002f687d6a-etc-swift\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc" Mar 09 03:00:53 crc kubenswrapper[4901]: E0309 03:00:53.678123 4901 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 03:00:53 crc kubenswrapper[4901]: E0309 03:00:53.678210 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 03:00:53 crc kubenswrapper[4901]: E0309 03:00:53.678357 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift podName:56e038ec-a406-4f6b-9b8a-135c56be7514 nodeName:}" failed. No retries permitted until 2026-03-09 03:00:54.678330778 +0000 UTC m=+1179.267994580 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift") pod "swift-storage-0" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514") : configmap "swift-ring-files" not found
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.680404 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-dispersionconf\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.683570 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-combined-ca-bundle\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.684323 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-dispersionconf\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.684608 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-swiftconf\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.687667 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.720613 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.733044 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.733325 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gsjhs"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.733713 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-combined-ca-bundle\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.734598 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.734997 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.746028 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjhtm\" (UniqueName: \"kubernetes.io/projected/990424a9-a455-4e13-a31b-88002f687d6a-kube-api-access-rjhtm\") pod \"swift-ring-rebalance-5z8xc\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") " pod="openstack/swift-ring-rebalance-5z8xc"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.754112 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvrl\" (UniqueName: \"kubernetes.io/projected/19d5d146-9bfc-45cd-ae62-ffd05473b125-kube-api-access-bfvrl\") pod \"swift-ring-rebalance-wbqpx\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " pod="openstack/swift-ring-rebalance-wbqpx"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.774310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.774392 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.774452 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbn95\" (UniqueName: \"kubernetes.io/projected/26f9c7a2-e2b4-4be1-8206-6c067702cc74-kube-api-access-vbn95\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.774524 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.774558 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.774600 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-config\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.774626 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-scripts\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.875810 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.875878 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.875929 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-config\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.875961 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-scripts\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.876002 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.876044 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.876090 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbn95\" (UniqueName: \"kubernetes.io/projected/26f9c7a2-e2b4-4be1-8206-6c067702cc74-kube-api-access-vbn95\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.877119 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.877493 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-scripts\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.877833 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-config\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.879482 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.879899 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.880489 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.891071 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wbqpx"
Mar 09 03:00:53 crc kubenswrapper[4901]: I0309 03:00:53.904783 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbn95\" (UniqueName: \"kubernetes.io/projected/26f9c7a2-e2b4-4be1-8206-6c067702cc74-kube-api-access-vbn95\") pod \"ovn-northd-0\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " pod="openstack/ovn-northd-0"
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.115175 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.300694 4901 generic.go:334] "Generic (PLEG): container finished" podID="171527dd-3e70-4f4e-8b52-9302b6fd6238" containerID="25377e9b21412a91545980379c30f3ccd01fe4876bc92c11e5fa4f0715c34cc7" exitCode=0
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.301057 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5z8xc"
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.301310 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" event={"ID":"171527dd-3e70-4f4e-8b52-9302b6fd6238","Type":"ContainerDied","Data":"25377e9b21412a91545980379c30f3ccd01fe4876bc92c11e5fa4f0715c34cc7"}
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.318539 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5z8xc"
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.359155 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wbqpx"]
Mar 09 03:00:54 crc kubenswrapper[4901]: W0309 03:00:54.362278 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19d5d146_9bfc_45cd_ae62_ffd05473b125.slice/crio-d66d1177026e29d5043c80764a9ceb0dc68bb3b8f35f0d8a4ac690907130d06d WatchSource:0}: Error finding container d66d1177026e29d5043c80764a9ceb0dc68bb3b8f35f0d8a4ac690907130d06d: Status 404 returned error can't find the container with id d66d1177026e29d5043c80764a9ceb0dc68bb3b8f35f0d8a4ac690907130d06d
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.397025 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/990424a9-a455-4e13-a31b-88002f687d6a-etc-swift\") pod \"990424a9-a455-4e13-a31b-88002f687d6a\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") "
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.397183 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-combined-ca-bundle\") pod \"990424a9-a455-4e13-a31b-88002f687d6a\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") "
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.397274 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-swiftconf\") pod \"990424a9-a455-4e13-a31b-88002f687d6a\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") "
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.397718 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/990424a9-a455-4e13-a31b-88002f687d6a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "990424a9-a455-4e13-a31b-88002f687d6a" (UID: "990424a9-a455-4e13-a31b-88002f687d6a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.398376 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-dispersionconf\") pod \"990424a9-a455-4e13-a31b-88002f687d6a\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") "
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.398451 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjhtm\" (UniqueName: \"kubernetes.io/projected/990424a9-a455-4e13-a31b-88002f687d6a-kube-api-access-rjhtm\") pod \"990424a9-a455-4e13-a31b-88002f687d6a\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") "
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.398485 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-ring-data-devices\") pod \"990424a9-a455-4e13-a31b-88002f687d6a\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") "
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.398562 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-scripts\") pod \"990424a9-a455-4e13-a31b-88002f687d6a\" (UID: \"990424a9-a455-4e13-a31b-88002f687d6a\") "
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.399119 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "990424a9-a455-4e13-a31b-88002f687d6a" (UID: "990424a9-a455-4e13-a31b-88002f687d6a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.399334 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-scripts" (OuterVolumeSpecName: "scripts") pod "990424a9-a455-4e13-a31b-88002f687d6a" (UID: "990424a9-a455-4e13-a31b-88002f687d6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.400176 4901 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.400198 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990424a9-a455-4e13-a31b-88002f687d6a-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.400208 4901 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/990424a9-a455-4e13-a31b-88002f687d6a-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.402919 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "990424a9-a455-4e13-a31b-88002f687d6a" (UID: "990424a9-a455-4e13-a31b-88002f687d6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.402933 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/990424a9-a455-4e13-a31b-88002f687d6a-kube-api-access-rjhtm" (OuterVolumeSpecName: "kube-api-access-rjhtm") pod "990424a9-a455-4e13-a31b-88002f687d6a" (UID: "990424a9-a455-4e13-a31b-88002f687d6a"). InnerVolumeSpecName "kube-api-access-rjhtm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.402965 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "990424a9-a455-4e13-a31b-88002f687d6a" (UID: "990424a9-a455-4e13-a31b-88002f687d6a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.405124 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "990424a9-a455-4e13-a31b-88002f687d6a" (UID: "990424a9-a455-4e13-a31b-88002f687d6a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.501476 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.501511 4901 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.501520 4901 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/990424a9-a455-4e13-a31b-88002f687d6a-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.501528 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjhtm\" (UniqueName: \"kubernetes.io/projected/990424a9-a455-4e13-a31b-88002f687d6a-kube-api-access-rjhtm\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.582620 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 09 03:00:54 crc kubenswrapper[4901]: W0309 03:00:54.586750 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26f9c7a2_e2b4_4be1_8206_6c067702cc74.slice/crio-032244c67de9b121df4c60605ba9ece2c844f49edd92bde9ded65add4865d57a WatchSource:0}: Error finding container 032244c67de9b121df4c60605ba9ece2c844f49edd92bde9ded65add4865d57a: Status 404 returned error can't find the container with id 032244c67de9b121df4c60605ba9ece2c844f49edd92bde9ded65add4865d57a
Mar 09 03:00:54 crc kubenswrapper[4901]: I0309 03:00:54.704418 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0"
Mar 09 03:00:54 crc kubenswrapper[4901]: E0309 03:00:54.704561 4901 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 03:00:54 crc kubenswrapper[4901]: E0309 03:00:54.704583 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 03:00:54 crc kubenswrapper[4901]: E0309 03:00:54.704644 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift podName:56e038ec-a406-4f6b-9b8a-135c56be7514 nodeName:}" failed. No retries permitted until 2026-03-09 03:00:56.704627277 +0000 UTC m=+1181.294291009 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift") pod "swift-storage-0" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514") : configmap "swift-ring-files" not found
Mar 09 03:00:55 crc kubenswrapper[4901]: I0309 03:00:55.310786 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wbqpx" event={"ID":"19d5d146-9bfc-45cd-ae62-ffd05473b125","Type":"ContainerStarted","Data":"d66d1177026e29d5043c80764a9ceb0dc68bb3b8f35f0d8a4ac690907130d06d"}
Mar 09 03:00:55 crc kubenswrapper[4901]: I0309 03:00:55.312319 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"26f9c7a2-e2b4-4be1-8206-6c067702cc74","Type":"ContainerStarted","Data":"032244c67de9b121df4c60605ba9ece2c844f49edd92bde9ded65add4865d57a"}
Mar 09 03:00:55 crc kubenswrapper[4901]: I0309 03:00:55.312374 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5z8xc"
Mar 09 03:00:55 crc kubenswrapper[4901]: I0309 03:00:55.378600 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5z8xc"]
Mar 09 03:00:55 crc kubenswrapper[4901]: I0309 03:00:55.385883 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-5z8xc"]
Mar 09 03:00:56 crc kubenswrapper[4901]: I0309 03:00:56.116243 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="990424a9-a455-4e13-a31b-88002f687d6a" path="/var/lib/kubelet/pods/990424a9-a455-4e13-a31b-88002f687d6a/volumes"
Mar 09 03:00:56 crc kubenswrapper[4901]: I0309 03:00:56.740955 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0"
Mar 09 03:00:56 crc kubenswrapper[4901]: E0309 03:00:56.741296 4901 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 03:00:56 crc kubenswrapper[4901]: E0309 03:00:56.741313 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 03:00:56 crc kubenswrapper[4901]: E0309 03:00:56.741360 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift podName:56e038ec-a406-4f6b-9b8a-135c56be7514 nodeName:}" failed. No retries permitted until 2026-03-09 03:01:00.741343097 +0000 UTC m=+1185.331006839 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift") pod "swift-storage-0" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514") : configmap "swift-ring-files" not found
Mar 09 03:00:57 crc kubenswrapper[4901]: I0309 03:00:57.554032 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv"
Mar 09 03:00:57 crc kubenswrapper[4901]: I0309 03:00:57.737983 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" podUID="171527dd-3e70-4f4e-8b52-9302b6fd6238" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused"
Mar 09 03:00:57 crc kubenswrapper[4901]: I0309 03:00:57.865707 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 09 03:00:57 crc kubenswrapper[4901]: I0309 03:00:57.865822 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.193744 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt"
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.354067 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt" event={"ID":"171527dd-3e70-4f4e-8b52-9302b6fd6238","Type":"ContainerDied","Data":"42c575fa96b6650caf9ba60b5cc69c7c9bec4dd715d2462e05d19f2f671ca948"}
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.354114 4901 scope.go:117] "RemoveContainer" containerID="25377e9b21412a91545980379c30f3ccd01fe4876bc92c11e5fa4f0715c34cc7"
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.354077 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-h5cnt"
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.356470 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-7dflj" event={"ID":"eb2ebb18-ec3e-4597-a641-de94b57c923d","Type":"ContainerStarted","Data":"bc53cc89e1ac261aeac91468049cc78702de8ea39f6d74b15594ae7e1ebf598d"}
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.363531 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-sb\") pod \"171527dd-3e70-4f4e-8b52-9302b6fd6238\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") "
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.363593 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-nb\") pod \"171527dd-3e70-4f4e-8b52-9302b6fd6238\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") "
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.363636 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz5n2\" (UniqueName: \"kubernetes.io/projected/171527dd-3e70-4f4e-8b52-9302b6fd6238-kube-api-access-xz5n2\") pod \"171527dd-3e70-4f4e-8b52-9302b6fd6238\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") "
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.363703 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-config\") pod \"171527dd-3e70-4f4e-8b52-9302b6fd6238\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") "
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.363765 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-dns-svc\") pod \"171527dd-3e70-4f4e-8b52-9302b6fd6238\" (UID: \"171527dd-3e70-4f4e-8b52-9302b6fd6238\") "
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.381294 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171527dd-3e70-4f4e-8b52-9302b6fd6238-kube-api-access-xz5n2" (OuterVolumeSpecName: "kube-api-access-xz5n2") pod "171527dd-3e70-4f4e-8b52-9302b6fd6238" (UID: "171527dd-3e70-4f4e-8b52-9302b6fd6238"). InnerVolumeSpecName "kube-api-access-xz5n2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.385665 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58df884995-7dflj" podStartSLOduration=7.385643893 podStartE2EDuration="7.385643893s" podCreationTimestamp="2026-03-09 03:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:00:58.381416716 +0000 UTC m=+1182.971080448" watchObservedRunningTime="2026-03-09 03:00:58.385643893 +0000 UTC m=+1182.975307645"
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.401848 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "171527dd-3e70-4f4e-8b52-9302b6fd6238" (UID: "171527dd-3e70-4f4e-8b52-9302b6fd6238"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.405011 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "171527dd-3e70-4f4e-8b52-9302b6fd6238" (UID: "171527dd-3e70-4f4e-8b52-9302b6fd6238"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.407553 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-config" (OuterVolumeSpecName: "config") pod "171527dd-3e70-4f4e-8b52-9302b6fd6238" (UID: "171527dd-3e70-4f4e-8b52-9302b6fd6238"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.415673 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "171527dd-3e70-4f4e-8b52-9302b6fd6238" (UID: "171527dd-3e70-4f4e-8b52-9302b6fd6238"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.466369 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.466396 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.466405 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz5n2\" (UniqueName: \"kubernetes.io/projected/171527dd-3e70-4f4e-8b52-9302b6fd6238-kube-api-access-xz5n2\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.466416 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-config\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.466424 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/171527dd-3e70-4f4e-8b52-9302b6fd6238-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 03:00:58 crc kubenswrapper[4901]: E0309 03:00:58.654119 4901 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.20:51180->38.102.83.20:42365: write tcp 38.102.83.20:51180->38.102.83.20:42365: write: broken pipe
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.687585 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-h5cnt"]
Mar 09 03:00:58 crc kubenswrapper[4901]: I0309 03:00:58.694550 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-h5cnt"]
Mar 09 03:00:59 crc kubenswrapper[4901]: I0309 03:00:59.232388 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 09 03:00:59 crc kubenswrapper[4901]: I0309 03:00:59.232518 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 09 03:00:59 crc kubenswrapper[4901]: I0309 03:00:59.312764 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 09 03:00:59 crc kubenswrapper[4901]: I0309 03:00:59.368191 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58df884995-7dflj"
Mar 09 03:00:59 crc kubenswrapper[4901]: I0309 03:00:59.426937 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 09 03:00:59 crc kubenswrapper[4901]: I0309 03:00:59.448381 4901 scope.go:117] "RemoveContainer" containerID="5ec18fba2a1c1f77b64b6ed489e33eb3eaf7fc823ecee63156c904288d9accd5"
Mar 09 03:01:00 crc kubenswrapper[4901]: I0309 03:01:00.116022 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="171527dd-3e70-4f4e-8b52-9302b6fd6238" path="/var/lib/kubelet/pods/171527dd-3e70-4f4e-8b52-9302b6fd6238/volumes"
Mar 09 03:01:00 crc kubenswrapper[4901]: I0309 03:01:00.810896 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0"
Mar 09 03:01:00 crc kubenswrapper[4901]: E0309 03:01:00.811333 4901 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 03:01:00 crc kubenswrapper[4901]: E0309 03:01:00.811352 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 03:01:00 crc kubenswrapper[4901]: E0309 03:01:00.811392 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift podName:56e038ec-a406-4f6b-9b8a-135c56be7514 nodeName:}" failed. No retries permitted until 2026-03-09 03:01:08.81137935 +0000 UTC m=+1193.401043082 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift") pod "swift-storage-0" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514") : configmap "swift-ring-files" not found
Mar 09 03:01:01 crc kubenswrapper[4901]: I0309 03:01:01.389104 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"26f9c7a2-e2b4-4be1-8206-6c067702cc74","Type":"ContainerStarted","Data":"40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593"}
Mar 09 03:01:01 crc kubenswrapper[4901]: I0309 03:01:01.389153 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"26f9c7a2-e2b4-4be1-8206-6c067702cc74","Type":"ContainerStarted","Data":"380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d"}
Mar 09 03:01:01 crc kubenswrapper[4901]: I0309 03:01:01.390366 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 09 03:01:01 crc kubenswrapper[4901]: I0309 03:01:01.392966 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wbqpx" event={"ID":"19d5d146-9bfc-45cd-ae62-ffd05473b125","Type":"ContainerStarted","Data":"d5910d7c426253115f7317e5b034b541d545d2f6be616729dbcd814afc42fd07"}
Mar 09 03:01:01 crc kubenswrapper[4901]: I0309 03:01:01.412121 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.465188351 podStartE2EDuration="8.41209767s" podCreationTimestamp="2026-03-09 03:00:53 +0000 UTC" firstStartedPulling="2026-03-09 03:00:54.588517767 +0000 UTC m=+1179.178181499" lastFinishedPulling="2026-03-09 03:01:00.535427076 +0000 UTC m=+1185.125090818" observedRunningTime="2026-03-09 03:01:01.407083084 +0000 UTC m=+1185.996746836" watchObservedRunningTime="2026-03-09 03:01:01.41209767 +0000 UTC m=+1186.001761432"
Mar 09 03:01:01 crc kubenswrapper[4901]: I0309 03:01:01.436913 4901 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-wbqpx" podStartSLOduration=2.258220919 podStartE2EDuration="8.436896446s" podCreationTimestamp="2026-03-09 03:00:53 +0000 UTC" firstStartedPulling="2026-03-09 03:00:54.367569942 +0000 UTC m=+1178.957233694" lastFinishedPulling="2026-03-09 03:01:00.546245489 +0000 UTC m=+1185.135909221" observedRunningTime="2026-03-09 03:01:01.4303093 +0000 UTC m=+1186.019973032" watchObservedRunningTime="2026-03-09 03:01:01.436896446 +0000 UTC m=+1186.026560178" Mar 09 03:01:01 crc kubenswrapper[4901]: I0309 03:01:01.987591 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.077442 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.211490 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.276789 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-rtwnv"] Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.277378 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" podUID="c431ae87-ef51-42f6-8ace-cd59ac685e50" containerName="dnsmasq-dns" containerID="cri-o://6ac311bce46a5221b5a6a5158083609bd8be2a49eb80aba6263cd5efcc18529d" gracePeriod=10 Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.400906 4901 generic.go:334] "Generic (PLEG): container finished" podID="c431ae87-ef51-42f6-8ace-cd59ac685e50" containerID="6ac311bce46a5221b5a6a5158083609bd8be2a49eb80aba6263cd5efcc18529d" exitCode=0 Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.401307 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" event={"ID":"c431ae87-ef51-42f6-8ace-cd59ac685e50","Type":"ContainerDied","Data":"6ac311bce46a5221b5a6a5158083609bd8be2a49eb80aba6263cd5efcc18529d"} Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.755547 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.855512 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-dns-svc\") pod \"c431ae87-ef51-42f6-8ace-cd59ac685e50\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.855723 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-config\") pod \"c431ae87-ef51-42f6-8ace-cd59ac685e50\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.855788 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-ovsdbserver-nb\") pod \"c431ae87-ef51-42f6-8ace-cd59ac685e50\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.855877 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dtkd\" (UniqueName: \"kubernetes.io/projected/c431ae87-ef51-42f6-8ace-cd59ac685e50-kube-api-access-7dtkd\") pod \"c431ae87-ef51-42f6-8ace-cd59ac685e50\" (UID: \"c431ae87-ef51-42f6-8ace-cd59ac685e50\") " Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.883655 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c431ae87-ef51-42f6-8ace-cd59ac685e50-kube-api-access-7dtkd" (OuterVolumeSpecName: "kube-api-access-7dtkd") pod "c431ae87-ef51-42f6-8ace-cd59ac685e50" (UID: "c431ae87-ef51-42f6-8ace-cd59ac685e50"). InnerVolumeSpecName "kube-api-access-7dtkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.900237 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c431ae87-ef51-42f6-8ace-cd59ac685e50" (UID: "c431ae87-ef51-42f6-8ace-cd59ac685e50"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.901782 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c431ae87-ef51-42f6-8ace-cd59ac685e50" (UID: "c431ae87-ef51-42f6-8ace-cd59ac685e50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.907231 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-config" (OuterVolumeSpecName: "config") pod "c431ae87-ef51-42f6-8ace-cd59ac685e50" (UID: "c431ae87-ef51-42f6-8ace-cd59ac685e50"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.960185 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.960244 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.960261 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dtkd\" (UniqueName: \"kubernetes.io/projected/c431ae87-ef51-42f6-8ace-cd59ac685e50-kube-api-access-7dtkd\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:02 crc kubenswrapper[4901]: I0309 03:01:02.960274 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c431ae87-ef51-42f6-8ace-cd59ac685e50-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:03 crc kubenswrapper[4901]: I0309 03:01:03.414126 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" Mar 09 03:01:03 crc kubenswrapper[4901]: I0309 03:01:03.415023 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" event={"ID":"c431ae87-ef51-42f6-8ace-cd59ac685e50","Type":"ContainerDied","Data":"d741d7455bb9a1b175ac7cbd4a02faff701c2968e20c1e248b918cc9d750d1e7"} Mar 09 03:01:03 crc kubenswrapper[4901]: I0309 03:01:03.415088 4901 scope.go:117] "RemoveContainer" containerID="6ac311bce46a5221b5a6a5158083609bd8be2a49eb80aba6263cd5efcc18529d" Mar 09 03:01:03 crc kubenswrapper[4901]: I0309 03:01:03.445708 4901 scope.go:117] "RemoveContainer" containerID="df9cd0199fcc69f609437b41bc77f246ff247244dbcd9149c203e85cb217921b" Mar 09 03:01:03 crc kubenswrapper[4901]: I0309 03:01:03.464212 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-rtwnv"] Mar 09 03:01:03 crc kubenswrapper[4901]: I0309 03:01:03.474063 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b47ddbdf5-rtwnv"] Mar 09 03:01:04 crc kubenswrapper[4901]: I0309 03:01:04.124418 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c431ae87-ef51-42f6-8ace-cd59ac685e50" path="/var/lib/kubelet/pods/c431ae87-ef51-42f6-8ace-cd59ac685e50/volumes" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.557487 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-277cz"] Mar 09 03:01:06 crc kubenswrapper[4901]: E0309 03:01:06.557761 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c431ae87-ef51-42f6-8ace-cd59ac685e50" containerName="dnsmasq-dns" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.557773 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c431ae87-ef51-42f6-8ace-cd59ac685e50" containerName="dnsmasq-dns" Mar 09 03:01:06 crc kubenswrapper[4901]: E0309 03:01:06.557795 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="171527dd-3e70-4f4e-8b52-9302b6fd6238" containerName="init" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.557800 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="171527dd-3e70-4f4e-8b52-9302b6fd6238" containerName="init" Mar 09 03:01:06 crc kubenswrapper[4901]: E0309 03:01:06.557809 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171527dd-3e70-4f4e-8b52-9302b6fd6238" containerName="dnsmasq-dns" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.557815 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="171527dd-3e70-4f4e-8b52-9302b6fd6238" containerName="dnsmasq-dns" Mar 09 03:01:06 crc kubenswrapper[4901]: E0309 03:01:06.557827 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c431ae87-ef51-42f6-8ace-cd59ac685e50" containerName="init" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.557833 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c431ae87-ef51-42f6-8ace-cd59ac685e50" containerName="init" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.557970 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c431ae87-ef51-42f6-8ace-cd59ac685e50" containerName="dnsmasq-dns" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.557982 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="171527dd-3e70-4f4e-8b52-9302b6fd6238" containerName="dnsmasq-dns" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.558434 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-277cz" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.566467 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.581603 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-277cz"] Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.624303 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0a96256-0032-47af-9508-352567bec408-operator-scripts\") pod \"root-account-create-update-277cz\" (UID: \"c0a96256-0032-47af-9508-352567bec408\") " pod="openstack/root-account-create-update-277cz" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.624382 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd482\" (UniqueName: \"kubernetes.io/projected/c0a96256-0032-47af-9508-352567bec408-kube-api-access-wd482\") pod \"root-account-create-update-277cz\" (UID: \"c0a96256-0032-47af-9508-352567bec408\") " pod="openstack/root-account-create-update-277cz" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.725453 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd482\" (UniqueName: \"kubernetes.io/projected/c0a96256-0032-47af-9508-352567bec408-kube-api-access-wd482\") pod \"root-account-create-update-277cz\" (UID: \"c0a96256-0032-47af-9508-352567bec408\") " pod="openstack/root-account-create-update-277cz" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.725601 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0a96256-0032-47af-9508-352567bec408-operator-scripts\") pod \"root-account-create-update-277cz\" (UID: 
\"c0a96256-0032-47af-9508-352567bec408\") " pod="openstack/root-account-create-update-277cz" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.726537 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0a96256-0032-47af-9508-352567bec408-operator-scripts\") pod \"root-account-create-update-277cz\" (UID: \"c0a96256-0032-47af-9508-352567bec408\") " pod="openstack/root-account-create-update-277cz" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.755172 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd482\" (UniqueName: \"kubernetes.io/projected/c0a96256-0032-47af-9508-352567bec408-kube-api-access-wd482\") pod \"root-account-create-update-277cz\" (UID: \"c0a96256-0032-47af-9508-352567bec408\") " pod="openstack/root-account-create-update-277cz" Mar 09 03:01:06 crc kubenswrapper[4901]: I0309 03:01:06.875704 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-277cz" Mar 09 03:01:07 crc kubenswrapper[4901]: W0309 03:01:07.414500 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0a96256_0032_47af_9508_352567bec408.slice/crio-f9e2e8169b81b25fc6408d03cbc17ec3ec9694a68d1bbf31e8b4ec7e95b92321 WatchSource:0}: Error finding container f9e2e8169b81b25fc6408d03cbc17ec3ec9694a68d1bbf31e8b4ec7e95b92321: Status 404 returned error can't find the container with id f9e2e8169b81b25fc6408d03cbc17ec3ec9694a68d1bbf31e8b4ec7e95b92321 Mar 09 03:01:07 crc kubenswrapper[4901]: I0309 03:01:07.416444 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-277cz"] Mar 09 03:01:07 crc kubenswrapper[4901]: I0309 03:01:07.471629 4901 generic.go:334] "Generic (PLEG): container finished" podID="19d5d146-9bfc-45cd-ae62-ffd05473b125" containerID="d5910d7c426253115f7317e5b034b541d545d2f6be616729dbcd814afc42fd07" exitCode=0 Mar 09 03:01:07 crc kubenswrapper[4901]: I0309 03:01:07.471690 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wbqpx" event={"ID":"19d5d146-9bfc-45cd-ae62-ffd05473b125","Type":"ContainerDied","Data":"d5910d7c426253115f7317e5b034b541d545d2f6be616729dbcd814afc42fd07"} Mar 09 03:01:07 crc kubenswrapper[4901]: I0309 03:01:07.477631 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-277cz" event={"ID":"c0a96256-0032-47af-9508-352567bec408","Type":"ContainerStarted","Data":"f9e2e8169b81b25fc6408d03cbc17ec3ec9694a68d1bbf31e8b4ec7e95b92321"} Mar 09 03:01:07 crc kubenswrapper[4901]: I0309 03:01:07.552011 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b47ddbdf5-rtwnv" podUID="c431ae87-ef51-42f6-8ace-cd59ac685e50" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Mar 09 03:01:08 crc 
kubenswrapper[4901]: I0309 03:01:08.489010 4901 generic.go:334] "Generic (PLEG): container finished" podID="c0a96256-0032-47af-9508-352567bec408" containerID="a2ad28f5f864002403825bfa3b2973d7bd3cc4be809b84ffdf5a9e79dab3db9b" exitCode=0 Mar 09 03:01:08 crc kubenswrapper[4901]: I0309 03:01:08.489134 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-277cz" event={"ID":"c0a96256-0032-47af-9508-352567bec408","Type":"ContainerDied","Data":"a2ad28f5f864002403825bfa3b2973d7bd3cc4be809b84ffdf5a9e79dab3db9b"} Mar 09 03:01:08 crc kubenswrapper[4901]: I0309 03:01:08.867657 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:01:08 crc kubenswrapper[4901]: I0309 03:01:08.879391 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift\") pod \"swift-storage-0\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " pod="openstack/swift-storage-0" Mar 09 03:01:08 crc kubenswrapper[4901]: I0309 03:01:08.929548 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.071250 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfvrl\" (UniqueName: \"kubernetes.io/projected/19d5d146-9bfc-45cd-ae62-ffd05473b125-kube-api-access-bfvrl\") pod \"19d5d146-9bfc-45cd-ae62-ffd05473b125\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.071666 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19d5d146-9bfc-45cd-ae62-ffd05473b125-etc-swift\") pod \"19d5d146-9bfc-45cd-ae62-ffd05473b125\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.071871 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-swiftconf\") pod \"19d5d146-9bfc-45cd-ae62-ffd05473b125\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.072635 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-dispersionconf\") pod \"19d5d146-9bfc-45cd-ae62-ffd05473b125\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.072692 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-ring-data-devices\") pod \"19d5d146-9bfc-45cd-ae62-ffd05473b125\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.072707 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/19d5d146-9bfc-45cd-ae62-ffd05473b125-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "19d5d146-9bfc-45cd-ae62-ffd05473b125" (UID: "19d5d146-9bfc-45cd-ae62-ffd05473b125"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.072725 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-combined-ca-bundle\") pod \"19d5d146-9bfc-45cd-ae62-ffd05473b125\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.072911 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-scripts\") pod \"19d5d146-9bfc-45cd-ae62-ffd05473b125\" (UID: \"19d5d146-9bfc-45cd-ae62-ffd05473b125\") " Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.073536 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "19d5d146-9bfc-45cd-ae62-ffd05473b125" (UID: "19d5d146-9bfc-45cd-ae62-ffd05473b125"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.073744 4901 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19d5d146-9bfc-45cd-ae62-ffd05473b125-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.073769 4901 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.078343 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d5d146-9bfc-45cd-ae62-ffd05473b125-kube-api-access-bfvrl" (OuterVolumeSpecName: "kube-api-access-bfvrl") pod "19d5d146-9bfc-45cd-ae62-ffd05473b125" (UID: "19d5d146-9bfc-45cd-ae62-ffd05473b125"). InnerVolumeSpecName "kube-api-access-bfvrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.084134 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "19d5d146-9bfc-45cd-ae62-ffd05473b125" (UID: "19d5d146-9bfc-45cd-ae62-ffd05473b125"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.098627 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19d5d146-9bfc-45cd-ae62-ffd05473b125" (UID: "19d5d146-9bfc-45cd-ae62-ffd05473b125"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.104955 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "19d5d146-9bfc-45cd-ae62-ffd05473b125" (UID: "19d5d146-9bfc-45cd-ae62-ffd05473b125"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.112410 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-scripts" (OuterVolumeSpecName: "scripts") pod "19d5d146-9bfc-45cd-ae62-ffd05473b125" (UID: "19d5d146-9bfc-45cd-ae62-ffd05473b125"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.138364 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.175643 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d5d146-9bfc-45cd-ae62-ffd05473b125-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.176023 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfvrl\" (UniqueName: \"kubernetes.io/projected/19d5d146-9bfc-45cd-ae62-ffd05473b125-kube-api-access-bfvrl\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.176209 4901 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.176960 4901 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.177157 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d5d146-9bfc-45cd-ae62-ffd05473b125-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.500678 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wbqpx" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.500658 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wbqpx" event={"ID":"19d5d146-9bfc-45cd-ae62-ffd05473b125","Type":"ContainerDied","Data":"d66d1177026e29d5043c80764a9ceb0dc68bb3b8f35f0d8a4ac690907130d06d"} Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.500736 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d66d1177026e29d5043c80764a9ceb0dc68bb3b8f35f0d8a4ac690907130d06d" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.652294 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-plljd"] Mar 09 03:01:09 crc kubenswrapper[4901]: E0309 03:01:09.652971 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d5d146-9bfc-45cd-ae62-ffd05473b125" containerName="swift-ring-rebalance" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.652992 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d5d146-9bfc-45cd-ae62-ffd05473b125" containerName="swift-ring-rebalance" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.653347 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d5d146-9bfc-45cd-ae62-ffd05473b125" containerName="swift-ring-rebalance" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.654181 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-plljd" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.667265 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-plljd"] Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.776087 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-149a-account-create-update-727vf"] Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.777594 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-149a-account-create-update-727vf" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.781498 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.781879 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-149a-account-create-update-727vf"] Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.798738 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t4ks\" (UniqueName: \"kubernetes.io/projected/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-kube-api-access-7t4ks\") pod \"glance-db-create-plljd\" (UID: \"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c\") " pod="openstack/glance-db-create-plljd" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.798765 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-operator-scripts\") pod \"glance-db-create-plljd\" (UID: \"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c\") " pod="openstack/glance-db-create-plljd" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.814473 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.899731 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t4ks\" (UniqueName: \"kubernetes.io/projected/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-kube-api-access-7t4ks\") pod \"glance-db-create-plljd\" (UID: \"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c\") " pod="openstack/glance-db-create-plljd" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.899859 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-operator-scripts\") pod \"glance-db-create-plljd\" (UID: \"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c\") " pod="openstack/glance-db-create-plljd" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.899993 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-operator-scripts\") pod \"glance-149a-account-create-update-727vf\" (UID: \"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0\") " pod="openstack/glance-149a-account-create-update-727vf" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.900091 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdbq8\" (UniqueName: \"kubernetes.io/projected/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-kube-api-access-mdbq8\") pod \"glance-149a-account-create-update-727vf\" (UID: \"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0\") " pod="openstack/glance-149a-account-create-update-727vf" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.901340 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-operator-scripts\") pod \"glance-db-create-plljd\" (UID: \"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c\") " pod="openstack/glance-db-create-plljd" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.927152 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t4ks\" (UniqueName: \"kubernetes.io/projected/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-kube-api-access-7t4ks\") pod \"glance-db-create-plljd\" (UID: \"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c\") " pod="openstack/glance-db-create-plljd" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.958318 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-277cz" Mar 09 03:01:09 crc kubenswrapper[4901]: I0309 03:01:09.998123 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-plljd" Mar 09 03:01:10 crc kubenswrapper[4901]: I0309 03:01:10.002032 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-operator-scripts\") pod \"glance-149a-account-create-update-727vf\" (UID: \"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0\") " pod="openstack/glance-149a-account-create-update-727vf" Mar 09 03:01:10 crc kubenswrapper[4901]: I0309 03:01:10.002110 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdbq8\" (UniqueName: \"kubernetes.io/projected/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-kube-api-access-mdbq8\") pod \"glance-149a-account-create-update-727vf\" (UID: \"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0\") " pod="openstack/glance-149a-account-create-update-727vf" Mar 09 03:01:10 crc kubenswrapper[4901]: I0309 03:01:10.002754 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-operator-scripts\") pod \"glance-149a-account-create-update-727vf\" (UID: \"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0\") " pod="openstack/glance-149a-account-create-update-727vf" Mar 09 03:01:10 crc kubenswrapper[4901]: I0309 03:01:10.022434 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdbq8\" (UniqueName: \"kubernetes.io/projected/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-kube-api-access-mdbq8\") pod \"glance-149a-account-create-update-727vf\" (UID: \"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0\") " pod="openstack/glance-149a-account-create-update-727vf" Mar 09 03:01:10 crc kubenswrapper[4901]: I0309 03:01:10.103643 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0a96256-0032-47af-9508-352567bec408-operator-scripts\") pod \"c0a96256-0032-47af-9508-352567bec408\" (UID: \"c0a96256-0032-47af-9508-352567bec408\") " Mar 09 03:01:10 crc kubenswrapper[4901]: I0309 03:01:10.103791 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd482\" (UniqueName: \"kubernetes.io/projected/c0a96256-0032-47af-9508-352567bec408-kube-api-access-wd482\") pod \"c0a96256-0032-47af-9508-352567bec408\" (UID: \"c0a96256-0032-47af-9508-352567bec408\") " Mar 09 03:01:10 crc kubenswrapper[4901]: I0309 03:01:10.104367 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0a96256-0032-47af-9508-352567bec408-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0a96256-0032-47af-9508-352567bec408" (UID: "c0a96256-0032-47af-9508-352567bec408"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:10 crc kubenswrapper[4901]: I0309 03:01:10.110133 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a96256-0032-47af-9508-352567bec408-kube-api-access-wd482" (OuterVolumeSpecName: "kube-api-access-wd482") pod "c0a96256-0032-47af-9508-352567bec408" (UID: "c0a96256-0032-47af-9508-352567bec408"). InnerVolumeSpecName "kube-api-access-wd482". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:10 crc kubenswrapper[4901]: I0309 03:01:10.119399 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-149a-account-create-update-727vf" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:10.205653 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0a96256-0032-47af-9508-352567bec408-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:10.205682 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd482\" (UniqueName: \"kubernetes.io/projected/c0a96256-0032-47af-9508-352567bec408-kube-api-access-wd482\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:10.867184 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"c53e28cc744f24ff1fba305406ef3d31a267154c97d80480108d1da32a3bf7b0"} Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:10.869805 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-277cz" event={"ID":"c0a96256-0032-47af-9508-352567bec408","Type":"ContainerDied","Data":"f9e2e8169b81b25fc6408d03cbc17ec3ec9694a68d1bbf31e8b4ec7e95b92321"} Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:10.869849 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9e2e8169b81b25fc6408d03cbc17ec3ec9694a68d1bbf31e8b4ec7e95b92321" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:10.869953 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-277cz" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:10.874088 4901 generic.go:334] "Generic (PLEG): container finished" podID="98538e55-cb87-49e2-9fd5-fff06d7edfdd" containerID="d573b837ddf089152e6738d97df2ec1aa5c6f25f6f2ae8c229ee9079ec71fbad" exitCode=0 Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:10.874130 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98538e55-cb87-49e2-9fd5-fff06d7edfdd","Type":"ContainerDied","Data":"d573b837ddf089152e6738d97df2ec1aa5c6f25f6f2ae8c229ee9079ec71fbad"} Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:10.998498 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-cszqf"] Mar 09 03:01:11 crc kubenswrapper[4901]: E0309 03:01:10.999662 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a96256-0032-47af-9508-352567bec408" containerName="mariadb-account-create-update" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:10.999683 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a96256-0032-47af-9508-352567bec408" containerName="mariadb-account-create-update" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.002042 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a96256-0032-47af-9508-352567bec408" containerName="mariadb-account-create-update" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.002774 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-cszqf" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.008370 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cszqf"] Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.019637 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0c1f-account-create-update-ldjfp"] Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.021328 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c1f-account-create-update-ldjfp" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.023433 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.035580 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0c1f-account-create-update-ldjfp"] Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.066593 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-plljd"] Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.079054 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m97c8"] Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.080348 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m97c8" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.092719 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8504-account-create-update-82s6b"] Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.094030 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8504-account-create-update-82s6b" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.096524 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.098667 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m97c8"] Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.105853 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8504-account-create-update-82s6b"] Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.157170 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10c171ad-86f3-4601-8abd-89334e351bc8-operator-scripts\") pod \"keystone-db-create-cszqf\" (UID: \"10c171ad-86f3-4601-8abd-89334e351bc8\") " pod="openstack/keystone-db-create-cszqf" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.157252 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pvtm\" (UniqueName: \"kubernetes.io/projected/10c171ad-86f3-4601-8abd-89334e351bc8-kube-api-access-4pvtm\") pod \"keystone-db-create-cszqf\" (UID: \"10c171ad-86f3-4601-8abd-89334e351bc8\") " pod="openstack/keystone-db-create-cszqf" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.157327 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e20e8f79-9634-43fd-a9a5-d2710f828a86-operator-scripts\") pod \"keystone-0c1f-account-create-update-ldjfp\" (UID: \"e20e8f79-9634-43fd-a9a5-d2710f828a86\") " pod="openstack/keystone-0c1f-account-create-update-ldjfp" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.157364 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-nc8k2\" (UniqueName: \"kubernetes.io/projected/e20e8f79-9634-43fd-a9a5-d2710f828a86-kube-api-access-nc8k2\") pod \"keystone-0c1f-account-create-update-ldjfp\" (UID: \"e20e8f79-9634-43fd-a9a5-d2710f828a86\") " pod="openstack/keystone-0c1f-account-create-update-ldjfp" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.258483 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e20e8f79-9634-43fd-a9a5-d2710f828a86-operator-scripts\") pod \"keystone-0c1f-account-create-update-ldjfp\" (UID: \"e20e8f79-9634-43fd-a9a5-d2710f828a86\") " pod="openstack/keystone-0c1f-account-create-update-ldjfp" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.258524 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjpm2\" (UniqueName: \"kubernetes.io/projected/d752c748-e235-4087-849e-3fe86c6e52b4-kube-api-access-qjpm2\") pod \"placement-db-create-m97c8\" (UID: \"d752c748-e235-4087-849e-3fe86c6e52b4\") " pod="openstack/placement-db-create-m97c8" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.258571 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc8k2\" (UniqueName: \"kubernetes.io/projected/e20e8f79-9634-43fd-a9a5-d2710f828a86-kube-api-access-nc8k2\") pod \"keystone-0c1f-account-create-update-ldjfp\" (UID: \"e20e8f79-9634-43fd-a9a5-d2710f828a86\") " pod="openstack/keystone-0c1f-account-create-update-ldjfp" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.258590 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d752c748-e235-4087-849e-3fe86c6e52b4-operator-scripts\") pod \"placement-db-create-m97c8\" (UID: \"d752c748-e235-4087-849e-3fe86c6e52b4\") " pod="openstack/placement-db-create-m97c8" Mar 09 03:01:11 crc 
kubenswrapper[4901]: I0309 03:01:11.258607 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e394216-8657-46dd-95d8-5d0e73512d11-operator-scripts\") pod \"placement-8504-account-create-update-82s6b\" (UID: \"7e394216-8657-46dd-95d8-5d0e73512d11\") " pod="openstack/placement-8504-account-create-update-82s6b" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.258668 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10c171ad-86f3-4601-8abd-89334e351bc8-operator-scripts\") pod \"keystone-db-create-cszqf\" (UID: \"10c171ad-86f3-4601-8abd-89334e351bc8\") " pod="openstack/keystone-db-create-cszqf" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.258718 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvc6t\" (UniqueName: \"kubernetes.io/projected/7e394216-8657-46dd-95d8-5d0e73512d11-kube-api-access-vvc6t\") pod \"placement-8504-account-create-update-82s6b\" (UID: \"7e394216-8657-46dd-95d8-5d0e73512d11\") " pod="openstack/placement-8504-account-create-update-82s6b" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.258736 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pvtm\" (UniqueName: \"kubernetes.io/projected/10c171ad-86f3-4601-8abd-89334e351bc8-kube-api-access-4pvtm\") pod \"keystone-db-create-cszqf\" (UID: \"10c171ad-86f3-4601-8abd-89334e351bc8\") " pod="openstack/keystone-db-create-cszqf" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.260859 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e20e8f79-9634-43fd-a9a5-d2710f828a86-operator-scripts\") pod \"keystone-0c1f-account-create-update-ldjfp\" (UID: 
\"e20e8f79-9634-43fd-a9a5-d2710f828a86\") " pod="openstack/keystone-0c1f-account-create-update-ldjfp" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.261501 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10c171ad-86f3-4601-8abd-89334e351bc8-operator-scripts\") pod \"keystone-db-create-cszqf\" (UID: \"10c171ad-86f3-4601-8abd-89334e351bc8\") " pod="openstack/keystone-db-create-cszqf" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.279305 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc8k2\" (UniqueName: \"kubernetes.io/projected/e20e8f79-9634-43fd-a9a5-d2710f828a86-kube-api-access-nc8k2\") pod \"keystone-0c1f-account-create-update-ldjfp\" (UID: \"e20e8f79-9634-43fd-a9a5-d2710f828a86\") " pod="openstack/keystone-0c1f-account-create-update-ldjfp" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.282168 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pvtm\" (UniqueName: \"kubernetes.io/projected/10c171ad-86f3-4601-8abd-89334e351bc8-kube-api-access-4pvtm\") pod \"keystone-db-create-cszqf\" (UID: \"10c171ad-86f3-4601-8abd-89334e351bc8\") " pod="openstack/keystone-db-create-cszqf" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.359914 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e394216-8657-46dd-95d8-5d0e73512d11-operator-scripts\") pod \"placement-8504-account-create-update-82s6b\" (UID: \"7e394216-8657-46dd-95d8-5d0e73512d11\") " pod="openstack/placement-8504-account-create-update-82s6b" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.360422 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvc6t\" (UniqueName: \"kubernetes.io/projected/7e394216-8657-46dd-95d8-5d0e73512d11-kube-api-access-vvc6t\") pod 
\"placement-8504-account-create-update-82s6b\" (UID: \"7e394216-8657-46dd-95d8-5d0e73512d11\") " pod="openstack/placement-8504-account-create-update-82s6b" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.360544 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjpm2\" (UniqueName: \"kubernetes.io/projected/d752c748-e235-4087-849e-3fe86c6e52b4-kube-api-access-qjpm2\") pod \"placement-db-create-m97c8\" (UID: \"d752c748-e235-4087-849e-3fe86c6e52b4\") " pod="openstack/placement-db-create-m97c8" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.360929 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d752c748-e235-4087-849e-3fe86c6e52b4-operator-scripts\") pod \"placement-db-create-m97c8\" (UID: \"d752c748-e235-4087-849e-3fe86c6e52b4\") " pod="openstack/placement-db-create-m97c8" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.361166 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e394216-8657-46dd-95d8-5d0e73512d11-operator-scripts\") pod \"placement-8504-account-create-update-82s6b\" (UID: \"7e394216-8657-46dd-95d8-5d0e73512d11\") " pod="openstack/placement-8504-account-create-update-82s6b" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.361910 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d752c748-e235-4087-849e-3fe86c6e52b4-operator-scripts\") pod \"placement-db-create-m97c8\" (UID: \"d752c748-e235-4087-849e-3fe86c6e52b4\") " pod="openstack/placement-db-create-m97c8" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.378923 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvc6t\" (UniqueName: \"kubernetes.io/projected/7e394216-8657-46dd-95d8-5d0e73512d11-kube-api-access-vvc6t\") pod 
\"placement-8504-account-create-update-82s6b\" (UID: \"7e394216-8657-46dd-95d8-5d0e73512d11\") " pod="openstack/placement-8504-account-create-update-82s6b" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.378940 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjpm2\" (UniqueName: \"kubernetes.io/projected/d752c748-e235-4087-849e-3fe86c6e52b4-kube-api-access-qjpm2\") pod \"placement-db-create-m97c8\" (UID: \"d752c748-e235-4087-849e-3fe86c6e52b4\") " pod="openstack/placement-db-create-m97c8" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.398145 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cszqf" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.430750 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c1f-account-create-update-ldjfp" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.466656 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m97c8" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.472835 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8504-account-create-update-82s6b" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.881282 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-149a-account-create-update-727vf"] Mar 09 03:01:11 crc kubenswrapper[4901]: W0309 03:01:11.886495 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d4ba8c8_8fd5_4f29_a16e_bf1e628f99c0.slice/crio-5e754b5f2eecde66fafe0c556649b1569e94538c74a316050ccc25fa10e50b14 WatchSource:0}: Error finding container 5e754b5f2eecde66fafe0c556649b1569e94538c74a316050ccc25fa10e50b14: Status 404 returned error can't find the container with id 5e754b5f2eecde66fafe0c556649b1569e94538c74a316050ccc25fa10e50b14 Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.892492 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98538e55-cb87-49e2-9fd5-fff06d7edfdd","Type":"ContainerStarted","Data":"d88fb8444efa6a21fe15aca1c8ba0da30c0a28364fd9a1356f05611a979ae19f"} Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.892807 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.895525 4901 generic.go:334] "Generic (PLEG): container finished" podID="5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c" containerID="9c89ebccb26bac47875a363934008992acf70441ad49375a8adc23935e857520" exitCode=0 Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.895570 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-plljd" event={"ID":"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c","Type":"ContainerDied","Data":"9c89ebccb26bac47875a363934008992acf70441ad49375a8adc23935e857520"} Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.895594 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-plljd" event={"ID":"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c","Type":"ContainerStarted","Data":"74b1008c43beb8ed9c2e3fecb89d0e55dfb75bd11d0977a020f0b54663cf609b"} Mar 09 03:01:11 crc kubenswrapper[4901]: I0309 03:01:11.923575 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.535921449 podStartE2EDuration="56.92356132s" podCreationTimestamp="2026-03-09 03:00:15 +0000 UTC" firstStartedPulling="2026-03-09 03:00:17.16983763 +0000 UTC m=+1141.759501362" lastFinishedPulling="2026-03-09 03:00:36.557477501 +0000 UTC m=+1161.147141233" observedRunningTime="2026-03-09 03:01:11.918658556 +0000 UTC m=+1196.508322288" watchObservedRunningTime="2026-03-09 03:01:11.92356132 +0000 UTC m=+1196.513225052" Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.085177 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0c1f-account-create-update-ldjfp"] Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.212306 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cszqf"] Mar 09 03:01:12 crc kubenswrapper[4901]: W0309 03:01:12.215367 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10c171ad_86f3_4601_8abd_89334e351bc8.slice/crio-ce4f8f32add6b370fc7d53777314b06d10ae01a20a9b58cdead0c52aa38907bf WatchSource:0}: Error finding container ce4f8f32add6b370fc7d53777314b06d10ae01a20a9b58cdead0c52aa38907bf: Status 404 returned error can't find the container with id ce4f8f32add6b370fc7d53777314b06d10ae01a20a9b58cdead0c52aa38907bf Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.282551 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m97c8"] Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.292623 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-8504-account-create-update-82s6b"] Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.906948 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8504-account-create-update-82s6b" event={"ID":"7e394216-8657-46dd-95d8-5d0e73512d11","Type":"ContainerStarted","Data":"2a9015c6301a41f7bb9edf9e61e1f441f7386e0db5929e6264964b6d4abbcbdb"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.913821 4901 generic.go:334] "Generic (PLEG): container finished" podID="e20e8f79-9634-43fd-a9a5-d2710f828a86" containerID="c97e0e91fc19928d4cd2d28f3a4db51e71395cd30f0e8c6635ab7ce7d7ad2c6b" exitCode=0 Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.917460 4901 generic.go:334] "Generic (PLEG): container finished" podID="1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0" containerID="7636d66231ac9d71a66a7ce47bddddcc5f7b6d3fa536ff6823ddb5d6e4ae9419" exitCode=0 Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.924111 4901 generic.go:334] "Generic (PLEG): container finished" podID="10c171ad-86f3-4601-8abd-89334e351bc8" containerID="9dd7765fed399b21a402a85557bb2e3752b30c45c4c7b942075acb9f6b5c5dd7" exitCode=0 Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.907269 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8504-account-create-update-82s6b" event={"ID":"7e394216-8657-46dd-95d8-5d0e73512d11","Type":"ContainerStarted","Data":"6183072df085fa6cd60d4e985a06442f5b18ee1f5ac146f660f8b9443c7ec24d"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929026 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-277cz"] Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929052 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m97c8" event={"ID":"d752c748-e235-4087-849e-3fe86c6e52b4","Type":"ContainerStarted","Data":"ddd5a5a2fd5db2b102441a537f515378dc011c9879e9d5aab1e360514034a5bc"} Mar 09 03:01:12 crc 
kubenswrapper[4901]: I0309 03:01:12.929069 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m97c8" event={"ID":"d752c748-e235-4087-849e-3fe86c6e52b4","Type":"ContainerStarted","Data":"29a2b23a2aa391cc713fd8d65b041afe837bb343c85ed2b6d76af257f395eb95"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929082 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0c1f-account-create-update-ldjfp" event={"ID":"e20e8f79-9634-43fd-a9a5-d2710f828a86","Type":"ContainerDied","Data":"c97e0e91fc19928d4cd2d28f3a4db51e71395cd30f0e8c6635ab7ce7d7ad2c6b"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929097 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0c1f-account-create-update-ldjfp" event={"ID":"e20e8f79-9634-43fd-a9a5-d2710f828a86","Type":"ContainerStarted","Data":"28e78e4192e38a0268a02456b542e8463ef24efc24dc798c0d78b244cea3d754"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929108 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-149a-account-create-update-727vf" event={"ID":"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0","Type":"ContainerDied","Data":"7636d66231ac9d71a66a7ce47bddddcc5f7b6d3fa536ff6823ddb5d6e4ae9419"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929120 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-149a-account-create-update-727vf" event={"ID":"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0","Type":"ContainerStarted","Data":"5e754b5f2eecde66fafe0c556649b1569e94538c74a316050ccc25fa10e50b14"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929130 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929139 4901 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929148 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929158 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929167 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cszqf" event={"ID":"10c171ad-86f3-4601-8abd-89334e351bc8","Type":"ContainerDied","Data":"9dd7765fed399b21a402a85557bb2e3752b30c45c4c7b942075acb9f6b5c5dd7"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.929179 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cszqf" event={"ID":"10c171ad-86f3-4601-8abd-89334e351bc8","Type":"ContainerStarted","Data":"ce4f8f32add6b370fc7d53777314b06d10ae01a20a9b58cdead0c52aa38907bf"} Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.946563 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-277cz"] Mar 09 03:01:12 crc kubenswrapper[4901]: I0309 03:01:12.955787 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8504-account-create-update-82s6b" podStartSLOduration=1.955749778 podStartE2EDuration="1.955749778s" podCreationTimestamp="2026-03-09 03:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-09 03:01:12.93125283 +0000 UTC m=+1197.520916592" watchObservedRunningTime="2026-03-09 03:01:12.955749778 +0000 UTC m=+1197.545413510" Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.003717 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-m97c8" podStartSLOduration=2.003701178 podStartE2EDuration="2.003701178s" podCreationTimestamp="2026-03-09 03:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:01:12.99824169 +0000 UTC m=+1197.587905432" watchObservedRunningTime="2026-03-09 03:01:13.003701178 +0000 UTC m=+1197.593364910" Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.198689 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-plljd" Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.320171 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t4ks\" (UniqueName: \"kubernetes.io/projected/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-kube-api-access-7t4ks\") pod \"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c\" (UID: \"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c\") " Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.320451 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-operator-scripts\") pod \"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c\" (UID: \"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c\") " Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.321074 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c" (UID: 
"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.325337 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-kube-api-access-7t4ks" (OuterVolumeSpecName: "kube-api-access-7t4ks") pod "5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c" (UID: "5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c"). InnerVolumeSpecName "kube-api-access-7t4ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.423797 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t4ks\" (UniqueName: \"kubernetes.io/projected/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-kube-api-access-7t4ks\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.423864 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.938843 4901 generic.go:334] "Generic (PLEG): container finished" podID="7e394216-8657-46dd-95d8-5d0e73512d11" containerID="2a9015c6301a41f7bb9edf9e61e1f441f7386e0db5929e6264964b6d4abbcbdb" exitCode=0 Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.938949 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8504-account-create-update-82s6b" event={"ID":"7e394216-8657-46dd-95d8-5d0e73512d11","Type":"ContainerDied","Data":"2a9015c6301a41f7bb9edf9e61e1f441f7386e0db5929e6264964b6d4abbcbdb"} Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.943213 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-plljd" 
event={"ID":"5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c","Type":"ContainerDied","Data":"74b1008c43beb8ed9c2e3fecb89d0e55dfb75bd11d0977a020f0b54663cf609b"} Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.943265 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74b1008c43beb8ed9c2e3fecb89d0e55dfb75bd11d0977a020f0b54663cf609b" Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.943313 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-plljd" Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.946093 4901 generic.go:334] "Generic (PLEG): container finished" podID="d752c748-e235-4087-849e-3fe86c6e52b4" containerID="ddd5a5a2fd5db2b102441a537f515378dc011c9879e9d5aab1e360514034a5bc" exitCode=0 Mar 09 03:01:13 crc kubenswrapper[4901]: I0309 03:01:13.946299 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m97c8" event={"ID":"d752c748-e235-4087-849e-3fe86c6e52b4","Type":"ContainerDied","Data":"ddd5a5a2fd5db2b102441a537f515378dc011c9879e9d5aab1e360514034a5bc"} Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.133617 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5rg5k" podUID="dc54c941-19d2-42c1-b9f0-a3a58999bda5" containerName="ovn-controller" probeResult="failure" output=< Mar 09 03:01:14 crc kubenswrapper[4901]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 03:01:14 crc kubenswrapper[4901]: > Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.154778 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a96256-0032-47af-9508-352567bec408" path="/var/lib/kubelet/pods/c0a96256-0032-47af-9508-352567bec408/volumes" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.207550 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 09 03:01:14 crc 
kubenswrapper[4901]: I0309 03:01:14.344462 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-149a-account-create-update-727vf" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.450633 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cszqf" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.466829 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-operator-scripts\") pod \"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0\" (UID: \"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0\") " Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.466901 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdbq8\" (UniqueName: \"kubernetes.io/projected/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-kube-api-access-mdbq8\") pod \"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0\" (UID: \"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0\") " Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.473314 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0" (UID: "1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.473605 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-kube-api-access-mdbq8" (OuterVolumeSpecName: "kube-api-access-mdbq8") pod "1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0" (UID: "1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0"). InnerVolumeSpecName "kube-api-access-mdbq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.478778 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c1f-account-create-update-ldjfp" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.568214 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pvtm\" (UniqueName: \"kubernetes.io/projected/10c171ad-86f3-4601-8abd-89334e351bc8-kube-api-access-4pvtm\") pod \"10c171ad-86f3-4601-8abd-89334e351bc8\" (UID: \"10c171ad-86f3-4601-8abd-89334e351bc8\") " Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.568379 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10c171ad-86f3-4601-8abd-89334e351bc8-operator-scripts\") pod \"10c171ad-86f3-4601-8abd-89334e351bc8\" (UID: \"10c171ad-86f3-4601-8abd-89334e351bc8\") " Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.568772 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.568791 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdbq8\" (UniqueName: \"kubernetes.io/projected/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0-kube-api-access-mdbq8\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.569317 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c171ad-86f3-4601-8abd-89334e351bc8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10c171ad-86f3-4601-8abd-89334e351bc8" (UID: "10c171ad-86f3-4601-8abd-89334e351bc8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.572966 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c171ad-86f3-4601-8abd-89334e351bc8-kube-api-access-4pvtm" (OuterVolumeSpecName: "kube-api-access-4pvtm") pod "10c171ad-86f3-4601-8abd-89334e351bc8" (UID: "10c171ad-86f3-4601-8abd-89334e351bc8"). InnerVolumeSpecName "kube-api-access-4pvtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.669436 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc8k2\" (UniqueName: \"kubernetes.io/projected/e20e8f79-9634-43fd-a9a5-d2710f828a86-kube-api-access-nc8k2\") pod \"e20e8f79-9634-43fd-a9a5-d2710f828a86\" (UID: \"e20e8f79-9634-43fd-a9a5-d2710f828a86\") " Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.669503 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e20e8f79-9634-43fd-a9a5-d2710f828a86-operator-scripts\") pod \"e20e8f79-9634-43fd-a9a5-d2710f828a86\" (UID: \"e20e8f79-9634-43fd-a9a5-d2710f828a86\") " Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.669790 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10c171ad-86f3-4601-8abd-89334e351bc8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.669805 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pvtm\" (UniqueName: \"kubernetes.io/projected/10c171ad-86f3-4601-8abd-89334e351bc8-kube-api-access-4pvtm\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.669947 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e20e8f79-9634-43fd-a9a5-d2710f828a86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e20e8f79-9634-43fd-a9a5-d2710f828a86" (UID: "e20e8f79-9634-43fd-a9a5-d2710f828a86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.674438 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20e8f79-9634-43fd-a9a5-d2710f828a86-kube-api-access-nc8k2" (OuterVolumeSpecName: "kube-api-access-nc8k2") pod "e20e8f79-9634-43fd-a9a5-d2710f828a86" (UID: "e20e8f79-9634-43fd-a9a5-d2710f828a86"). InnerVolumeSpecName "kube-api-access-nc8k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.770882 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc8k2\" (UniqueName: \"kubernetes.io/projected/e20e8f79-9634-43fd-a9a5-d2710f828a86-kube-api-access-nc8k2\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.770915 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e20e8f79-9634-43fd-a9a5-d2710f828a86-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.986851 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0c1f-account-create-update-ldjfp" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.986840 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0c1f-account-create-update-ldjfp" event={"ID":"e20e8f79-9634-43fd-a9a5-d2710f828a86","Type":"ContainerDied","Data":"28e78e4192e38a0268a02456b542e8463ef24efc24dc798c0d78b244cea3d754"} Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.987002 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28e78e4192e38a0268a02456b542e8463ef24efc24dc798c0d78b244cea3d754" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.988672 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-149a-account-create-update-727vf" event={"ID":"1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0","Type":"ContainerDied","Data":"5e754b5f2eecde66fafe0c556649b1569e94538c74a316050ccc25fa10e50b14"} Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.988709 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e754b5f2eecde66fafe0c556649b1569e94538c74a316050ccc25fa10e50b14" Mar 09 03:01:14 crc kubenswrapper[4901]: I0309 03:01:14.988790 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-149a-account-create-update-727vf" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.010525 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14"} Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.010572 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8"} Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.010585 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f"} Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.019763 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cszqf" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.022294 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cszqf" event={"ID":"10c171ad-86f3-4601-8abd-89334e351bc8","Type":"ContainerDied","Data":"ce4f8f32add6b370fc7d53777314b06d10ae01a20a9b58cdead0c52aa38907bf"} Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.022330 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce4f8f32add6b370fc7d53777314b06d10ae01a20a9b58cdead0c52aa38907bf" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.622748 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-m97c8" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.629811 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8504-account-create-update-82s6b" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.789405 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvc6t\" (UniqueName: \"kubernetes.io/projected/7e394216-8657-46dd-95d8-5d0e73512d11-kube-api-access-vvc6t\") pod \"7e394216-8657-46dd-95d8-5d0e73512d11\" (UID: \"7e394216-8657-46dd-95d8-5d0e73512d11\") " Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.789463 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d752c748-e235-4087-849e-3fe86c6e52b4-operator-scripts\") pod \"d752c748-e235-4087-849e-3fe86c6e52b4\" (UID: \"d752c748-e235-4087-849e-3fe86c6e52b4\") " Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.789533 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e394216-8657-46dd-95d8-5d0e73512d11-operator-scripts\") pod \"7e394216-8657-46dd-95d8-5d0e73512d11\" (UID: \"7e394216-8657-46dd-95d8-5d0e73512d11\") " Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.789562 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjpm2\" (UniqueName: \"kubernetes.io/projected/d752c748-e235-4087-849e-3fe86c6e52b4-kube-api-access-qjpm2\") pod \"d752c748-e235-4087-849e-3fe86c6e52b4\" (UID: \"d752c748-e235-4087-849e-3fe86c6e52b4\") " Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.791275 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e394216-8657-46dd-95d8-5d0e73512d11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"7e394216-8657-46dd-95d8-5d0e73512d11" (UID: "7e394216-8657-46dd-95d8-5d0e73512d11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.791396 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d752c748-e235-4087-849e-3fe86c6e52b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d752c748-e235-4087-849e-3fe86c6e52b4" (UID: "d752c748-e235-4087-849e-3fe86c6e52b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.795011 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d752c748-e235-4087-849e-3fe86c6e52b4-kube-api-access-qjpm2" (OuterVolumeSpecName: "kube-api-access-qjpm2") pod "d752c748-e235-4087-849e-3fe86c6e52b4" (UID: "d752c748-e235-4087-849e-3fe86c6e52b4"). InnerVolumeSpecName "kube-api-access-qjpm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.795460 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e394216-8657-46dd-95d8-5d0e73512d11-kube-api-access-vvc6t" (OuterVolumeSpecName: "kube-api-access-vvc6t") pod "7e394216-8657-46dd-95d8-5d0e73512d11" (UID: "7e394216-8657-46dd-95d8-5d0e73512d11"). InnerVolumeSpecName "kube-api-access-vvc6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.891454 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvc6t\" (UniqueName: \"kubernetes.io/projected/7e394216-8657-46dd-95d8-5d0e73512d11-kube-api-access-vvc6t\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.891491 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d752c748-e235-4087-849e-3fe86c6e52b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.891511 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e394216-8657-46dd-95d8-5d0e73512d11-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:15 crc kubenswrapper[4901]: I0309 03:01:15.891528 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjpm2\" (UniqueName: \"kubernetes.io/projected/d752c748-e235-4087-849e-3fe86c6e52b4-kube-api-access-qjpm2\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:16 crc kubenswrapper[4901]: I0309 03:01:16.037827 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd"} Mar 09 03:01:16 crc kubenswrapper[4901]: I0309 03:01:16.041192 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8504-account-create-update-82s6b" Mar 09 03:01:16 crc kubenswrapper[4901]: I0309 03:01:16.041170 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8504-account-create-update-82s6b" event={"ID":"7e394216-8657-46dd-95d8-5d0e73512d11","Type":"ContainerDied","Data":"6183072df085fa6cd60d4e985a06442f5b18ee1f5ac146f660f8b9443c7ec24d"} Mar 09 03:01:16 crc kubenswrapper[4901]: I0309 03:01:16.042157 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6183072df085fa6cd60d4e985a06442f5b18ee1f5ac146f660f8b9443c7ec24d" Mar 09 03:01:16 crc kubenswrapper[4901]: I0309 03:01:16.043438 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m97c8" event={"ID":"d752c748-e235-4087-849e-3fe86c6e52b4","Type":"ContainerDied","Data":"29a2b23a2aa391cc713fd8d65b041afe837bb343c85ed2b6d76af257f395eb95"} Mar 09 03:01:16 crc kubenswrapper[4901]: I0309 03:01:16.043647 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a2b23a2aa391cc713fd8d65b041afe837bb343c85ed2b6d76af257f395eb95" Mar 09 03:01:16 crc kubenswrapper[4901]: I0309 03:01:16.043558 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-m97c8" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.921698 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-b7nv8"] Mar 09 03:01:17 crc kubenswrapper[4901]: E0309 03:01:17.922470 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d752c748-e235-4087-849e-3fe86c6e52b4" containerName="mariadb-database-create" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922483 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d752c748-e235-4087-849e-3fe86c6e52b4" containerName="mariadb-database-create" Mar 09 03:01:17 crc kubenswrapper[4901]: E0309 03:01:17.922492 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0" containerName="mariadb-account-create-update" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922498 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0" containerName="mariadb-account-create-update" Mar 09 03:01:17 crc kubenswrapper[4901]: E0309 03:01:17.922515 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c171ad-86f3-4601-8abd-89334e351bc8" containerName="mariadb-database-create" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922521 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c171ad-86f3-4601-8abd-89334e351bc8" containerName="mariadb-database-create" Mar 09 03:01:17 crc kubenswrapper[4901]: E0309 03:01:17.922538 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c" containerName="mariadb-database-create" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922544 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c" containerName="mariadb-database-create" Mar 09 03:01:17 crc kubenswrapper[4901]: E0309 03:01:17.922553 4901 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e20e8f79-9634-43fd-a9a5-d2710f828a86" containerName="mariadb-account-create-update" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922559 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20e8f79-9634-43fd-a9a5-d2710f828a86" containerName="mariadb-account-create-update" Mar 09 03:01:17 crc kubenswrapper[4901]: E0309 03:01:17.922574 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e394216-8657-46dd-95d8-5d0e73512d11" containerName="mariadb-account-create-update" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922579 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e394216-8657-46dd-95d8-5d0e73512d11" containerName="mariadb-account-create-update" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922711 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c" containerName="mariadb-database-create" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922727 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c171ad-86f3-4601-8abd-89334e351bc8" containerName="mariadb-database-create" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922735 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0" containerName="mariadb-account-create-update" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922744 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e394216-8657-46dd-95d8-5d0e73512d11" containerName="mariadb-account-create-update" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922751 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20e8f79-9634-43fd-a9a5-d2710f828a86" containerName="mariadb-account-create-update" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.922760 4901 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d752c748-e235-4087-849e-3fe86c6e52b4" containerName="mariadb-database-create" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.923278 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b7nv8" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.927064 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 09 03:01:17 crc kubenswrapper[4901]: I0309 03:01:17.928208 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b7nv8"] Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.043083 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rn4j\" (UniqueName: \"kubernetes.io/projected/87881a32-acab-48f5-8e13-a5f2c01fdc09-kube-api-access-8rn4j\") pod \"root-account-create-update-b7nv8\" (UID: \"87881a32-acab-48f5-8e13-a5f2c01fdc09\") " pod="openstack/root-account-create-update-b7nv8" Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.043173 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87881a32-acab-48f5-8e13-a5f2c01fdc09-operator-scripts\") pod \"root-account-create-update-b7nv8\" (UID: \"87881a32-acab-48f5-8e13-a5f2c01fdc09\") " pod="openstack/root-account-create-update-b7nv8" Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.062755 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11"} Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.062936 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973"} Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.063001 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17"} Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.063062 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6"} Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.063121 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf"} Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.144879 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rn4j\" (UniqueName: \"kubernetes.io/projected/87881a32-acab-48f5-8e13-a5f2c01fdc09-kube-api-access-8rn4j\") pod \"root-account-create-update-b7nv8\" (UID: \"87881a32-acab-48f5-8e13-a5f2c01fdc09\") " pod="openstack/root-account-create-update-b7nv8" Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.145154 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87881a32-acab-48f5-8e13-a5f2c01fdc09-operator-scripts\") pod \"root-account-create-update-b7nv8\" (UID: \"87881a32-acab-48f5-8e13-a5f2c01fdc09\") " pod="openstack/root-account-create-update-b7nv8" Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.145865 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87881a32-acab-48f5-8e13-a5f2c01fdc09-operator-scripts\") pod \"root-account-create-update-b7nv8\" (UID: \"87881a32-acab-48f5-8e13-a5f2c01fdc09\") " pod="openstack/root-account-create-update-b7nv8" Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.174870 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rn4j\" (UniqueName: \"kubernetes.io/projected/87881a32-acab-48f5-8e13-a5f2c01fdc09-kube-api-access-8rn4j\") pod \"root-account-create-update-b7nv8\" (UID: \"87881a32-acab-48f5-8e13-a5f2c01fdc09\") " pod="openstack/root-account-create-update-b7nv8" Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.255032 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b7nv8" Mar 09 03:01:18 crc kubenswrapper[4901]: I0309 03:01:18.502193 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b7nv8"] Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.075088 4901 generic.go:334] "Generic (PLEG): container finished" podID="87881a32-acab-48f5-8e13-a5f2c01fdc09" containerID="34a6b8e27fa535bcf55548c59ff362d66bb423d69d3e0871101811cba5ab368d" exitCode=0 Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.075243 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b7nv8" event={"ID":"87881a32-acab-48f5-8e13-a5f2c01fdc09","Type":"ContainerDied","Data":"34a6b8e27fa535bcf55548c59ff362d66bb423d69d3e0871101811cba5ab368d"} Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.075314 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b7nv8" event={"ID":"87881a32-acab-48f5-8e13-a5f2c01fdc09","Type":"ContainerStarted","Data":"6dc49186226346aa1be2dfd7b4c192d93aaaca077a28a5a011a0cfb072f70857"} Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.092721 4901 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d"} Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.092784 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerStarted","Data":"98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6"} Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.099477 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.104123 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.105516 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5rg5k" podUID="dc54c941-19d2-42c1-b9f0-a3a58999bda5" containerName="ovn-controller" probeResult="failure" output=< Mar 09 03:01:19 crc kubenswrapper[4901]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 03:01:19 crc kubenswrapper[4901]: > Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.147942 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.821844404 podStartE2EDuration="28.147923156s" podCreationTimestamp="2026-03-09 03:00:51 +0000 UTC" firstStartedPulling="2026-03-09 03:01:09.814004353 +0000 UTC m=+1194.403668085" lastFinishedPulling="2026-03-09 03:01:17.140083105 +0000 UTC m=+1201.729746837" observedRunningTime="2026-03-09 03:01:19.138997421 +0000 UTC m=+1203.728661163" watchObservedRunningTime="2026-03-09 03:01:19.147923156 +0000 UTC m=+1203.737586898" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.359847 4901 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5rg5k-config-pkp7w"] Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.361270 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.369912 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5rg5k-config-pkp7w"] Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.370995 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.450262 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-2vwc7"] Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.452022 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.453498 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.463859 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run-ovn\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.463928 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlhv2\" (UniqueName: \"kubernetes.io/projected/00ae6119-9761-4588-8e95-525cbe33198a-kube-api-access-zlhv2\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 
03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.463970 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-scripts\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.464032 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-log-ovn\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.464058 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.464092 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-additional-scripts\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.465859 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-2vwc7"] Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565680 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzshw\" 
(UniqueName: \"kubernetes.io/projected/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-kube-api-access-nzshw\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565731 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565758 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-log-ovn\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565780 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565810 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-additional-scripts\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565849 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565869 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-config\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565898 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-svc\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565928 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run-ovn\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565957 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlhv2\" (UniqueName: \"kubernetes.io/projected/00ae6119-9761-4588-8e95-525cbe33198a-kube-api-access-zlhv2\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565982 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-scripts\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.565999 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.566317 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-log-ovn\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.566362 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.566967 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-additional-scripts\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.567027 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run-ovn\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.568779 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-scripts\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.584001 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlhv2\" (UniqueName: \"kubernetes.io/projected/00ae6119-9761-4588-8e95-525cbe33198a-kube-api-access-zlhv2\") pod \"ovn-controller-5rg5k-config-pkp7w\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.667296 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzshw\" (UniqueName: \"kubernetes.io/projected/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-kube-api-access-nzshw\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.667368 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.667480 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.667568 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-config\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.667638 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-svc\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.667751 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.668525 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.668828 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.669246 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-config\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.669214 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.669954 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-svc\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.680525 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.690258 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzshw\" (UniqueName: \"kubernetes.io/projected/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-kube-api-access-nzshw\") pod \"dnsmasq-dns-5bdcf4fccc-2vwc7\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.778985 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.966683 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-kb7jk"] Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.968416 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.977933 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kb7jk"] Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.979785 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 09 03:01:19 crc kubenswrapper[4901]: I0309 03:01:19.979978 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tdwfn" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.076312 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-combined-ca-bundle\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.076620 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-config-data\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.076640 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-db-sync-config-data\") pod \"glance-db-sync-kb7jk\" (UID: 
\"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.076680 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5qwr\" (UniqueName: \"kubernetes.io/projected/61302320-9299-4dcc-abeb-05c28dd977c1-kube-api-access-j5qwr\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.193817 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-combined-ca-bundle\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.193912 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-config-data\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.193950 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-db-sync-config-data\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.194014 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5qwr\" (UniqueName: \"kubernetes.io/projected/61302320-9299-4dcc-abeb-05c28dd977c1-kube-api-access-j5qwr\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 
09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.197817 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-combined-ca-bundle\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.198634 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-config-data\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.203849 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-db-sync-config-data\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.205263 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-2vwc7"] Mar 09 03:01:20 crc kubenswrapper[4901]: W0309 03:01:20.207332 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode48bc07d_a9a2_45bb_ac1a_c1dd8eca08fe.slice/crio-fd2fd4d0512fad1a2d30715fd2ba612e370d39334f001ff1a942426bdca7066c WatchSource:0}: Error finding container fd2fd4d0512fad1a2d30715fd2ba612e370d39334f001ff1a942426bdca7066c: Status 404 returned error can't find the container with id fd2fd4d0512fad1a2d30715fd2ba612e370d39334f001ff1a942426bdca7066c Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.211553 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5qwr\" (UniqueName: 
\"kubernetes.io/projected/61302320-9299-4dcc-abeb-05c28dd977c1-kube-api-access-j5qwr\") pod \"glance-db-sync-kb7jk\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.255863 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5rg5k-config-pkp7w"] Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.296460 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.552660 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b7nv8" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.699456 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rn4j\" (UniqueName: \"kubernetes.io/projected/87881a32-acab-48f5-8e13-a5f2c01fdc09-kube-api-access-8rn4j\") pod \"87881a32-acab-48f5-8e13-a5f2c01fdc09\" (UID: \"87881a32-acab-48f5-8e13-a5f2c01fdc09\") " Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.699535 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87881a32-acab-48f5-8e13-a5f2c01fdc09-operator-scripts\") pod \"87881a32-acab-48f5-8e13-a5f2c01fdc09\" (UID: \"87881a32-acab-48f5-8e13-a5f2c01fdc09\") " Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.700445 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87881a32-acab-48f5-8e13-a5f2c01fdc09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87881a32-acab-48f5-8e13-a5f2c01fdc09" (UID: "87881a32-acab-48f5-8e13-a5f2c01fdc09"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.705947 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87881a32-acab-48f5-8e13-a5f2c01fdc09-kube-api-access-8rn4j" (OuterVolumeSpecName: "kube-api-access-8rn4j") pod "87881a32-acab-48f5-8e13-a5f2c01fdc09" (UID: "87881a32-acab-48f5-8e13-a5f2c01fdc09"). InnerVolumeSpecName "kube-api-access-8rn4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.801217 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rn4j\" (UniqueName: \"kubernetes.io/projected/87881a32-acab-48f5-8e13-a5f2c01fdc09-kube-api-access-8rn4j\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.801642 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87881a32-acab-48f5-8e13-a5f2c01fdc09-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:20 crc kubenswrapper[4901]: I0309 03:01:20.955664 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kb7jk"] Mar 09 03:01:20 crc kubenswrapper[4901]: W0309 03:01:20.972242 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61302320_9299_4dcc_abeb_05c28dd977c1.slice/crio-b2a5ae07e4928c1b599b4fb3fa452d070e25efb638f5b9edf3ef30e5b2d57649 WatchSource:0}: Error finding container b2a5ae07e4928c1b599b4fb3fa452d070e25efb638f5b9edf3ef30e5b2d57649: Status 404 returned error can't find the container with id b2a5ae07e4928c1b599b4fb3fa452d070e25efb638f5b9edf3ef30e5b2d57649 Mar 09 03:01:21 crc kubenswrapper[4901]: I0309 03:01:21.111772 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kb7jk" 
event={"ID":"61302320-9299-4dcc-abeb-05c28dd977c1","Type":"ContainerStarted","Data":"b2a5ae07e4928c1b599b4fb3fa452d070e25efb638f5b9edf3ef30e5b2d57649"} Mar 09 03:01:21 crc kubenswrapper[4901]: I0309 03:01:21.113886 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b7nv8" event={"ID":"87881a32-acab-48f5-8e13-a5f2c01fdc09","Type":"ContainerDied","Data":"6dc49186226346aa1be2dfd7b4c192d93aaaca077a28a5a011a0cfb072f70857"} Mar 09 03:01:21 crc kubenswrapper[4901]: I0309 03:01:21.113909 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc49186226346aa1be2dfd7b4c192d93aaaca077a28a5a011a0cfb072f70857" Mar 09 03:01:21 crc kubenswrapper[4901]: I0309 03:01:21.113965 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b7nv8" Mar 09 03:01:21 crc kubenswrapper[4901]: I0309 03:01:21.115526 4901 generic.go:334] "Generic (PLEG): container finished" podID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" containerID="df9376a93fc75107929870fac305532c8f8535f22398bb8dc3f3654ba39d51f4" exitCode=0 Mar 09 03:01:21 crc kubenswrapper[4901]: I0309 03:01:21.115564 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" event={"ID":"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe","Type":"ContainerDied","Data":"df9376a93fc75107929870fac305532c8f8535f22398bb8dc3f3654ba39d51f4"} Mar 09 03:01:21 crc kubenswrapper[4901]: I0309 03:01:21.115579 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" event={"ID":"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe","Type":"ContainerStarted","Data":"fd2fd4d0512fad1a2d30715fd2ba612e370d39334f001ff1a942426bdca7066c"} Mar 09 03:01:21 crc kubenswrapper[4901]: I0309 03:01:21.119313 4901 generic.go:334] "Generic (PLEG): container finished" podID="00ae6119-9761-4588-8e95-525cbe33198a" 
containerID="ce0cbdc3bc8aab4b550b8ac5657a16eab3fb78b29d770ddb8c402a6b846084d9" exitCode=0 Mar 09 03:01:21 crc kubenswrapper[4901]: I0309 03:01:21.119360 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5rg5k-config-pkp7w" event={"ID":"00ae6119-9761-4588-8e95-525cbe33198a","Type":"ContainerDied","Data":"ce0cbdc3bc8aab4b550b8ac5657a16eab3fb78b29d770ddb8c402a6b846084d9"} Mar 09 03:01:21 crc kubenswrapper[4901]: I0309 03:01:21.119388 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5rg5k-config-pkp7w" event={"ID":"00ae6119-9761-4588-8e95-525cbe33198a","Type":"ContainerStarted","Data":"30141bf4df44366d6efc7f7dbe003df9edadfe7d3d9a72a6d432208cbaef7fcc"} Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.129694 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" event={"ID":"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe","Type":"ContainerStarted","Data":"5b5ddb2d4483db37db677d989441cc79de881a4e50a87a1af5715665794290a6"} Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.130032 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.151155 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" podStartSLOduration=3.151141256 podStartE2EDuration="3.151141256s" podCreationTimestamp="2026-03-09 03:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:01:22.150010857 +0000 UTC m=+1206.739674589" watchObservedRunningTime="2026-03-09 03:01:22.151141256 +0000 UTC m=+1206.740804988" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.459263 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.534902 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-scripts\") pod \"00ae6119-9761-4588-8e95-525cbe33198a\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.534982 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-log-ovn\") pod \"00ae6119-9761-4588-8e95-525cbe33198a\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.535027 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run\") pod \"00ae6119-9761-4588-8e95-525cbe33198a\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.535059 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run-ovn\") pod \"00ae6119-9761-4588-8e95-525cbe33198a\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.535078 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlhv2\" (UniqueName: \"kubernetes.io/projected/00ae6119-9761-4588-8e95-525cbe33198a-kube-api-access-zlhv2\") pod \"00ae6119-9761-4588-8e95-525cbe33198a\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.535125 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-additional-scripts\") pod \"00ae6119-9761-4588-8e95-525cbe33198a\" (UID: \"00ae6119-9761-4588-8e95-525cbe33198a\") " Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.535482 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run" (OuterVolumeSpecName: "var-run") pod "00ae6119-9761-4588-8e95-525cbe33198a" (UID: "00ae6119-9761-4588-8e95-525cbe33198a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.535535 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "00ae6119-9761-4588-8e95-525cbe33198a" (UID: "00ae6119-9761-4588-8e95-525cbe33198a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.535528 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "00ae6119-9761-4588-8e95-525cbe33198a" (UID: "00ae6119-9761-4588-8e95-525cbe33198a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.536552 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "00ae6119-9761-4588-8e95-525cbe33198a" (UID: "00ae6119-9761-4588-8e95-525cbe33198a"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.536716 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-scripts" (OuterVolumeSpecName: "scripts") pod "00ae6119-9761-4588-8e95-525cbe33198a" (UID: "00ae6119-9761-4588-8e95-525cbe33198a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.541259 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ae6119-9761-4588-8e95-525cbe33198a-kube-api-access-zlhv2" (OuterVolumeSpecName: "kube-api-access-zlhv2") pod "00ae6119-9761-4588-8e95-525cbe33198a" (UID: "00ae6119-9761-4588-8e95-525cbe33198a"). InnerVolumeSpecName "kube-api-access-zlhv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.636331 4901 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.636367 4901 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.636375 4901 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00ae6119-9761-4588-8e95-525cbe33198a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.636385 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlhv2\" (UniqueName: \"kubernetes.io/projected/00ae6119-9761-4588-8e95-525cbe33198a-kube-api-access-zlhv2\") on node \"crc\" DevicePath \"\"" 
Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.636395 4901 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:22 crc kubenswrapper[4901]: I0309 03:01:22.636404 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00ae6119-9761-4588-8e95-525cbe33198a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:23 crc kubenswrapper[4901]: I0309 03:01:23.138302 4901 generic.go:334] "Generic (PLEG): container finished" podID="46c7df0b-fc0a-4fd9-b097-72da03442510" containerID="16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7" exitCode=0 Mar 09 03:01:23 crc kubenswrapper[4901]: I0309 03:01:23.138512 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46c7df0b-fc0a-4fd9-b097-72da03442510","Type":"ContainerDied","Data":"16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7"} Mar 09 03:01:23 crc kubenswrapper[4901]: I0309 03:01:23.143025 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5rg5k-config-pkp7w" Mar 09 03:01:23 crc kubenswrapper[4901]: I0309 03:01:23.147623 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5rg5k-config-pkp7w" event={"ID":"00ae6119-9761-4588-8e95-525cbe33198a","Type":"ContainerDied","Data":"30141bf4df44366d6efc7f7dbe003df9edadfe7d3d9a72a6d432208cbaef7fcc"} Mar 09 03:01:23 crc kubenswrapper[4901]: I0309 03:01:23.147666 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30141bf4df44366d6efc7f7dbe003df9edadfe7d3d9a72a6d432208cbaef7fcc" Mar 09 03:01:23 crc kubenswrapper[4901]: I0309 03:01:23.582602 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5rg5k-config-pkp7w"] Mar 09 03:01:23 crc kubenswrapper[4901]: I0309 03:01:23.593591 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5rg5k-config-pkp7w"] Mar 09 03:01:24 crc kubenswrapper[4901]: I0309 03:01:24.082561 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-5rg5k" Mar 09 03:01:24 crc kubenswrapper[4901]: I0309 03:01:24.130643 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ae6119-9761-4588-8e95-525cbe33198a" path="/var/lib/kubelet/pods/00ae6119-9761-4588-8e95-525cbe33198a/volumes" Mar 09 03:01:24 crc kubenswrapper[4901]: I0309 03:01:24.159879 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46c7df0b-fc0a-4fd9-b097-72da03442510","Type":"ContainerStarted","Data":"8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0"} Mar 09 03:01:24 crc kubenswrapper[4901]: I0309 03:01:24.161045 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 03:01:24 crc kubenswrapper[4901]: I0309 03:01:24.197308 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371966.657486 podStartE2EDuration="1m10.197290633s" podCreationTimestamp="2026-03-09 03:00:14 +0000 UTC" firstStartedPulling="2026-03-09 03:00:16.171915849 +0000 UTC m=+1140.761579581" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:01:24.187043104 +0000 UTC m=+1208.776706836" watchObservedRunningTime="2026-03-09 03:01:24.197290633 +0000 UTC m=+1208.786954365" Mar 09 03:01:26 crc kubenswrapper[4901]: I0309 03:01:26.608468 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:01:29 crc kubenswrapper[4901]: I0309 03:01:29.780390 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:01:29 crc kubenswrapper[4901]: I0309 03:01:29.847726 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58df884995-7dflj"] Mar 09 03:01:29 crc kubenswrapper[4901]: I0309 03:01:29.847963 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58df884995-7dflj" podUID="eb2ebb18-ec3e-4597-a641-de94b57c923d" containerName="dnsmasq-dns" containerID="cri-o://bc53cc89e1ac261aeac91468049cc78702de8ea39f6d74b15594ae7e1ebf598d" gracePeriod=10 Mar 09 03:01:30 crc kubenswrapper[4901]: I0309 03:01:30.211530 4901 generic.go:334] "Generic (PLEG): container finished" podID="eb2ebb18-ec3e-4597-a641-de94b57c923d" containerID="bc53cc89e1ac261aeac91468049cc78702de8ea39f6d74b15594ae7e1ebf598d" exitCode=0 Mar 09 03:01:30 crc kubenswrapper[4901]: I0309 03:01:30.211619 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-7dflj" event={"ID":"eb2ebb18-ec3e-4597-a641-de94b57c923d","Type":"ContainerDied","Data":"bc53cc89e1ac261aeac91468049cc78702de8ea39f6d74b15594ae7e1ebf598d"} Mar 09 03:01:32 crc kubenswrapper[4901]: I0309 03:01:32.211490 4901 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58df884995-7dflj" podUID="eb2ebb18-ec3e-4597-a641-de94b57c923d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.135641 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.234960 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-7dflj" event={"ID":"eb2ebb18-ec3e-4597-a641-de94b57c923d","Type":"ContainerDied","Data":"49d8ae16315304ac35357ac8beff794405cca949dc50117ce9dbf2491502f20e"} Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.235010 4901 scope.go:117] "RemoveContainer" containerID="bc53cc89e1ac261aeac91468049cc78702de8ea39f6d74b15594ae7e1ebf598d" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.235112 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-7dflj" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.272904 4901 scope.go:117] "RemoveContainer" containerID="65662cf1784cc36b38023a272dab489323d59c686112b747b011f6a531cd625c" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.323779 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-config\") pod \"eb2ebb18-ec3e-4597-a641-de94b57c923d\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.323845 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-sb\") pod \"eb2ebb18-ec3e-4597-a641-de94b57c923d\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.323930 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkmpp\" (UniqueName: \"kubernetes.io/projected/eb2ebb18-ec3e-4597-a641-de94b57c923d-kube-api-access-nkmpp\") pod \"eb2ebb18-ec3e-4597-a641-de94b57c923d\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.323992 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-dns-svc\") pod \"eb2ebb18-ec3e-4597-a641-de94b57c923d\" (UID: \"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.324018 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-nb\") pod \"eb2ebb18-ec3e-4597-a641-de94b57c923d\" (UID: 
\"eb2ebb18-ec3e-4597-a641-de94b57c923d\") " Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.330959 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2ebb18-ec3e-4597-a641-de94b57c923d-kube-api-access-nkmpp" (OuterVolumeSpecName: "kube-api-access-nkmpp") pod "eb2ebb18-ec3e-4597-a641-de94b57c923d" (UID: "eb2ebb18-ec3e-4597-a641-de94b57c923d"). InnerVolumeSpecName "kube-api-access-nkmpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.368024 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-config" (OuterVolumeSpecName: "config") pod "eb2ebb18-ec3e-4597-a641-de94b57c923d" (UID: "eb2ebb18-ec3e-4597-a641-de94b57c923d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.369145 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb2ebb18-ec3e-4597-a641-de94b57c923d" (UID: "eb2ebb18-ec3e-4597-a641-de94b57c923d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.375167 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb2ebb18-ec3e-4597-a641-de94b57c923d" (UID: "eb2ebb18-ec3e-4597-a641-de94b57c923d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.389913 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb2ebb18-ec3e-4597-a641-de94b57c923d" (UID: "eb2ebb18-ec3e-4597-a641-de94b57c923d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.425925 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.425988 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.426622 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkmpp\" (UniqueName: \"kubernetes.io/projected/eb2ebb18-ec3e-4597-a641-de94b57c923d-kube-api-access-nkmpp\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.426656 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.426666 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ebb18-ec3e-4597-a641-de94b57c923d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.571918 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58df884995-7dflj"] Mar 09 
03:01:33 crc kubenswrapper[4901]: I0309 03:01:33.590271 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58df884995-7dflj"] Mar 09 03:01:34 crc kubenswrapper[4901]: I0309 03:01:34.122024 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb2ebb18-ec3e-4597-a641-de94b57c923d" path="/var/lib/kubelet/pods/eb2ebb18-ec3e-4597-a641-de94b57c923d/volumes" Mar 09 03:01:34 crc kubenswrapper[4901]: I0309 03:01:34.251443 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kb7jk" event={"ID":"61302320-9299-4dcc-abeb-05c28dd977c1","Type":"ContainerStarted","Data":"6bf103e4ec5e9ca0d5bc452b5db7fc5f1271184e38999c1f0e1d1b833052187b"} Mar 09 03:01:34 crc kubenswrapper[4901]: I0309 03:01:34.275561 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-kb7jk" podStartSLOduration=3.293708853 podStartE2EDuration="15.275534291s" podCreationTimestamp="2026-03-09 03:01:19 +0000 UTC" firstStartedPulling="2026-03-09 03:01:20.975440255 +0000 UTC m=+1205.565103987" lastFinishedPulling="2026-03-09 03:01:32.957265673 +0000 UTC m=+1217.546929425" observedRunningTime="2026-03-09 03:01:34.272940765 +0000 UTC m=+1218.862604507" watchObservedRunningTime="2026-03-09 03:01:34.275534291 +0000 UTC m=+1218.865198063" Mar 09 03:01:35 crc kubenswrapper[4901]: I0309 03:01:35.714520 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 03:01:35 crc kubenswrapper[4901]: I0309 03:01:35.998919 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-nghlv"] Mar 09 03:01:35 crc kubenswrapper[4901]: E0309 03:01:35.999350 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ae6119-9761-4588-8e95-525cbe33198a" containerName="ovn-config" Mar 09 03:01:35 crc kubenswrapper[4901]: I0309 03:01:35.999366 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="00ae6119-9761-4588-8e95-525cbe33198a" containerName="ovn-config" Mar 09 03:01:35 crc kubenswrapper[4901]: E0309 03:01:35.999388 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87881a32-acab-48f5-8e13-a5f2c01fdc09" containerName="mariadb-account-create-update" Mar 09 03:01:35 crc kubenswrapper[4901]: I0309 03:01:35.999396 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="87881a32-acab-48f5-8e13-a5f2c01fdc09" containerName="mariadb-account-create-update" Mar 09 03:01:35 crc kubenswrapper[4901]: E0309 03:01:35.999416 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2ebb18-ec3e-4597-a641-de94b57c923d" containerName="dnsmasq-dns" Mar 09 03:01:35 crc kubenswrapper[4901]: I0309 03:01:35.999423 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2ebb18-ec3e-4597-a641-de94b57c923d" containerName="dnsmasq-dns" Mar 09 03:01:35 crc kubenswrapper[4901]: E0309 03:01:35.999432 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2ebb18-ec3e-4597-a641-de94b57c923d" containerName="init" Mar 09 03:01:35 crc kubenswrapper[4901]: I0309 03:01:35.999439 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2ebb18-ec3e-4597-a641-de94b57c923d" containerName="init" Mar 09 03:01:35 crc kubenswrapper[4901]: I0309 03:01:35.999634 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ae6119-9761-4588-8e95-525cbe33198a" containerName="ovn-config" Mar 09 03:01:35 crc kubenswrapper[4901]: I0309 03:01:35.999653 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="87881a32-acab-48f5-8e13-a5f2c01fdc09" containerName="mariadb-account-create-update" Mar 09 03:01:35 crc kubenswrapper[4901]: I0309 03:01:35.999673 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2ebb18-ec3e-4597-a641-de94b57c923d" containerName="dnsmasq-dns" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.000501 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nghlv" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.014110 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nghlv"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.115765 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8712-account-create-update-6l9cg"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.116712 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8712-account-create-update-6l9cg" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.119645 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.129348 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8712-account-create-update-6l9cg"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.173163 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tz7d\" (UniqueName: \"kubernetes.io/projected/ea546a17-3742-4171-a2ff-1df8b5dce890-kube-api-access-6tz7d\") pod \"cinder-db-create-nghlv\" (UID: \"ea546a17-3742-4171-a2ff-1df8b5dce890\") " pod="openstack/cinder-db-create-nghlv" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.173217 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea546a17-3742-4171-a2ff-1df8b5dce890-operator-scripts\") pod \"cinder-db-create-nghlv\" (UID: \"ea546a17-3742-4171-a2ff-1df8b5dce890\") " pod="openstack/cinder-db-create-nghlv" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.275063 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/173f3ec0-4825-426f-8a8f-fa693b7068d2-operator-scripts\") pod \"cinder-8712-account-create-update-6l9cg\" (UID: \"173f3ec0-4825-426f-8a8f-fa693b7068d2\") " pod="openstack/cinder-8712-account-create-update-6l9cg" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.275140 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tz7d\" (UniqueName: \"kubernetes.io/projected/ea546a17-3742-4171-a2ff-1df8b5dce890-kube-api-access-6tz7d\") pod \"cinder-db-create-nghlv\" (UID: \"ea546a17-3742-4171-a2ff-1df8b5dce890\") " pod="openstack/cinder-db-create-nghlv" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.275325 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea546a17-3742-4171-a2ff-1df8b5dce890-operator-scripts\") pod \"cinder-db-create-nghlv\" (UID: \"ea546a17-3742-4171-a2ff-1df8b5dce890\") " pod="openstack/cinder-db-create-nghlv" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.275509 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6c4k\" (UniqueName: \"kubernetes.io/projected/173f3ec0-4825-426f-8a8f-fa693b7068d2-kube-api-access-x6c4k\") pod \"cinder-8712-account-create-update-6l9cg\" (UID: \"173f3ec0-4825-426f-8a8f-fa693b7068d2\") " pod="openstack/cinder-8712-account-create-update-6l9cg" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.276058 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea546a17-3742-4171-a2ff-1df8b5dce890-operator-scripts\") pod \"cinder-db-create-nghlv\" (UID: \"ea546a17-3742-4171-a2ff-1df8b5dce890\") " pod="openstack/cinder-db-create-nghlv" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.304994 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-sxdq5"] Mar 09 03:01:36 
crc kubenswrapper[4901]: I0309 03:01:36.305994 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sxdq5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.323180 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3c33-account-create-update-5dpsl"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.324486 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c33-account-create-update-5dpsl" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.328366 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sxdq5"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.330106 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.339381 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3c33-account-create-update-5dpsl"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.353903 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tz7d\" (UniqueName: \"kubernetes.io/projected/ea546a17-3742-4171-a2ff-1df8b5dce890-kube-api-access-6tz7d\") pod \"cinder-db-create-nghlv\" (UID: \"ea546a17-3742-4171-a2ff-1df8b5dce890\") " pod="openstack/cinder-db-create-nghlv" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.376587 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/173f3ec0-4825-426f-8a8f-fa693b7068d2-operator-scripts\") pod \"cinder-8712-account-create-update-6l9cg\" (UID: \"173f3ec0-4825-426f-8a8f-fa693b7068d2\") " pod="openstack/cinder-8712-account-create-update-6l9cg" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.376894 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x6c4k\" (UniqueName: \"kubernetes.io/projected/173f3ec0-4825-426f-8a8f-fa693b7068d2-kube-api-access-x6c4k\") pod \"cinder-8712-account-create-update-6l9cg\" (UID: \"173f3ec0-4825-426f-8a8f-fa693b7068d2\") " pod="openstack/cinder-8712-account-create-update-6l9cg" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.377518 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/173f3ec0-4825-426f-8a8f-fa693b7068d2-operator-scripts\") pod \"cinder-8712-account-create-update-6l9cg\" (UID: \"173f3ec0-4825-426f-8a8f-fa693b7068d2\") " pod="openstack/cinder-8712-account-create-update-6l9cg" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.399585 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6c4k\" (UniqueName: \"kubernetes.io/projected/173f3ec0-4825-426f-8a8f-fa693b7068d2-kube-api-access-x6c4k\") pod \"cinder-8712-account-create-update-6l9cg\" (UID: \"173f3ec0-4825-426f-8a8f-fa693b7068d2\") " pod="openstack/cinder-8712-account-create-update-6l9cg" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.434849 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-29jrh"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.436407 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-29jrh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.446400 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-29jrh"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.474673 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8712-account-create-update-6l9cg" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.477987 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2grx\" (UniqueName: \"kubernetes.io/projected/617dd4a9-970a-4c70-b587-f74323c172da-kube-api-access-g2grx\") pod \"barbican-db-create-sxdq5\" (UID: \"617dd4a9-970a-4c70-b587-f74323c172da\") " pod="openstack/barbican-db-create-sxdq5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.478044 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5d1410-e474-4ae8-980b-46092cc080b0-operator-scripts\") pod \"barbican-3c33-account-create-update-5dpsl\" (UID: \"df5d1410-e474-4ae8-980b-46092cc080b0\") " pod="openstack/barbican-3c33-account-create-update-5dpsl" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.478096 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-795jw\" (UniqueName: \"kubernetes.io/projected/df5d1410-e474-4ae8-980b-46092cc080b0-kube-api-access-795jw\") pod \"barbican-3c33-account-create-update-5dpsl\" (UID: \"df5d1410-e474-4ae8-980b-46092cc080b0\") " pod="openstack/barbican-3c33-account-create-update-5dpsl" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.478139 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/617dd4a9-970a-4c70-b587-f74323c172da-operator-scripts\") pod \"barbican-db-create-sxdq5\" (UID: \"617dd4a9-970a-4c70-b587-f74323c172da\") " pod="openstack/barbican-db-create-sxdq5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.490831 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jl2xh"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 
03:01:36.492020 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.496733 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.496926 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.497086 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.497186 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ghd6s" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.498941 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jl2xh"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.543275 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e2c2-account-create-update-kdpd5"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.544386 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e2c2-account-create-update-kdpd5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.546244 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.560349 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e2c2-account-create-update-kdpd5"] Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.584207 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bwxq\" (UniqueName: \"kubernetes.io/projected/0b6c57bb-65c0-4563-a902-94e55b6f0713-kube-api-access-2bwxq\") pod \"keystone-db-sync-jl2xh\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.584275 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prvdd\" (UniqueName: \"kubernetes.io/projected/142cab77-aec7-45a4-9c64-45c3209e2a9d-kube-api-access-prvdd\") pod \"neutron-db-create-29jrh\" (UID: \"142cab77-aec7-45a4-9c64-45c3209e2a9d\") " pod="openstack/neutron-db-create-29jrh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.584394 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2grx\" (UniqueName: \"kubernetes.io/projected/617dd4a9-970a-4c70-b587-f74323c172da-kube-api-access-g2grx\") pod \"barbican-db-create-sxdq5\" (UID: \"617dd4a9-970a-4c70-b587-f74323c172da\") " pod="openstack/barbican-db-create-sxdq5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.584450 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5d1410-e474-4ae8-980b-46092cc080b0-operator-scripts\") pod \"barbican-3c33-account-create-update-5dpsl\" (UID: \"df5d1410-e474-4ae8-980b-46092cc080b0\") " 
pod="openstack/barbican-3c33-account-create-update-5dpsl" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.584487 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-config-data\") pod \"keystone-db-sync-jl2xh\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.584526 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/142cab77-aec7-45a4-9c64-45c3209e2a9d-operator-scripts\") pod \"neutron-db-create-29jrh\" (UID: \"142cab77-aec7-45a4-9c64-45c3209e2a9d\") " pod="openstack/neutron-db-create-29jrh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.584543 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-795jw\" (UniqueName: \"kubernetes.io/projected/df5d1410-e474-4ae8-980b-46092cc080b0-kube-api-access-795jw\") pod \"barbican-3c33-account-create-update-5dpsl\" (UID: \"df5d1410-e474-4ae8-980b-46092cc080b0\") " pod="openstack/barbican-3c33-account-create-update-5dpsl" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.584573 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-combined-ca-bundle\") pod \"keystone-db-sync-jl2xh\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.584614 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/617dd4a9-970a-4c70-b587-f74323c172da-operator-scripts\") pod \"barbican-db-create-sxdq5\" (UID: 
\"617dd4a9-970a-4c70-b587-f74323c172da\") " pod="openstack/barbican-db-create-sxdq5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.585382 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/617dd4a9-970a-4c70-b587-f74323c172da-operator-scripts\") pod \"barbican-db-create-sxdq5\" (UID: \"617dd4a9-970a-4c70-b587-f74323c172da\") " pod="openstack/barbican-db-create-sxdq5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.585507 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5d1410-e474-4ae8-980b-46092cc080b0-operator-scripts\") pod \"barbican-3c33-account-create-update-5dpsl\" (UID: \"df5d1410-e474-4ae8-980b-46092cc080b0\") " pod="openstack/barbican-3c33-account-create-update-5dpsl" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.614857 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-795jw\" (UniqueName: \"kubernetes.io/projected/df5d1410-e474-4ae8-980b-46092cc080b0-kube-api-access-795jw\") pod \"barbican-3c33-account-create-update-5dpsl\" (UID: \"df5d1410-e474-4ae8-980b-46092cc080b0\") " pod="openstack/barbican-3c33-account-create-update-5dpsl" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.616388 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nghlv" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.632776 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2grx\" (UniqueName: \"kubernetes.io/projected/617dd4a9-970a-4c70-b587-f74323c172da-kube-api-access-g2grx\") pod \"barbican-db-create-sxdq5\" (UID: \"617dd4a9-970a-4c70-b587-f74323c172da\") " pod="openstack/barbican-db-create-sxdq5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.703806 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bwxq\" (UniqueName: \"kubernetes.io/projected/0b6c57bb-65c0-4563-a902-94e55b6f0713-kube-api-access-2bwxq\") pod \"keystone-db-sync-jl2xh\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.703851 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prvdd\" (UniqueName: \"kubernetes.io/projected/142cab77-aec7-45a4-9c64-45c3209e2a9d-kube-api-access-prvdd\") pod \"neutron-db-create-29jrh\" (UID: \"142cab77-aec7-45a4-9c64-45c3209e2a9d\") " pod="openstack/neutron-db-create-29jrh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.703917 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-operator-scripts\") pod \"neutron-e2c2-account-create-update-kdpd5\" (UID: \"50a6d871-6acd-45a8-ae76-958e8fd0b9ec\") " pod="openstack/neutron-e2c2-account-create-update-kdpd5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.703945 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-config-data\") pod \"keystone-db-sync-jl2xh\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " 
pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.703975 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/142cab77-aec7-45a4-9c64-45c3209e2a9d-operator-scripts\") pod \"neutron-db-create-29jrh\" (UID: \"142cab77-aec7-45a4-9c64-45c3209e2a9d\") " pod="openstack/neutron-db-create-29jrh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.704000 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-combined-ca-bundle\") pod \"keystone-db-sync-jl2xh\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.704022 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdqsl\" (UniqueName: \"kubernetes.io/projected/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-kube-api-access-sdqsl\") pod \"neutron-e2c2-account-create-update-kdpd5\" (UID: \"50a6d871-6acd-45a8-ae76-958e8fd0b9ec\") " pod="openstack/neutron-e2c2-account-create-update-kdpd5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.705202 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/142cab77-aec7-45a4-9c64-45c3209e2a9d-operator-scripts\") pod \"neutron-db-create-29jrh\" (UID: \"142cab77-aec7-45a4-9c64-45c3209e2a9d\") " pod="openstack/neutron-db-create-29jrh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.705377 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3c33-account-create-update-5dpsl" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.710156 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-combined-ca-bundle\") pod \"keystone-db-sync-jl2xh\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.721289 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-config-data\") pod \"keystone-db-sync-jl2xh\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.738314 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prvdd\" (UniqueName: \"kubernetes.io/projected/142cab77-aec7-45a4-9c64-45c3209e2a9d-kube-api-access-prvdd\") pod \"neutron-db-create-29jrh\" (UID: \"142cab77-aec7-45a4-9c64-45c3209e2a9d\") " pod="openstack/neutron-db-create-29jrh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.762190 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bwxq\" (UniqueName: \"kubernetes.io/projected/0b6c57bb-65c0-4563-a902-94e55b6f0713-kube-api-access-2bwxq\") pod \"keystone-db-sync-jl2xh\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.762284 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-29jrh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.806168 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-operator-scripts\") pod \"neutron-e2c2-account-create-update-kdpd5\" (UID: \"50a6d871-6acd-45a8-ae76-958e8fd0b9ec\") " pod="openstack/neutron-e2c2-account-create-update-kdpd5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.806520 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdqsl\" (UniqueName: \"kubernetes.io/projected/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-kube-api-access-sdqsl\") pod \"neutron-e2c2-account-create-update-kdpd5\" (UID: \"50a6d871-6acd-45a8-ae76-958e8fd0b9ec\") " pod="openstack/neutron-e2c2-account-create-update-kdpd5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.807549 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-operator-scripts\") pod \"neutron-e2c2-account-create-update-kdpd5\" (UID: \"50a6d871-6acd-45a8-ae76-958e8fd0b9ec\") " pod="openstack/neutron-e2c2-account-create-update-kdpd5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.828740 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdqsl\" (UniqueName: \"kubernetes.io/projected/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-kube-api-access-sdqsl\") pod \"neutron-e2c2-account-create-update-kdpd5\" (UID: \"50a6d871-6acd-45a8-ae76-958e8fd0b9ec\") " pod="openstack/neutron-e2c2-account-create-update-kdpd5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.877454 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.892034 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e2c2-account-create-update-kdpd5" Mar 09 03:01:36 crc kubenswrapper[4901]: I0309 03:01:36.922058 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sxdq5" Mar 09 03:01:37 crc kubenswrapper[4901]: I0309 03:01:37.167417 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nghlv"] Mar 09 03:01:37 crc kubenswrapper[4901]: I0309 03:01:37.237810 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8712-account-create-update-6l9cg"] Mar 09 03:01:37 crc kubenswrapper[4901]: W0309 03:01:37.250650 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod173f3ec0_4825_426f_8a8f_fa693b7068d2.slice/crio-bf93d0068890565339dd04dd75e629aa6ada6128315702a0b8dc8d0cf3f0c442 WatchSource:0}: Error finding container bf93d0068890565339dd04dd75e629aa6ada6128315702a0b8dc8d0cf3f0c442: Status 404 returned error can't find the container with id bf93d0068890565339dd04dd75e629aa6ada6128315702a0b8dc8d0cf3f0c442 Mar 09 03:01:37 crc kubenswrapper[4901]: I0309 03:01:37.280703 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8712-account-create-update-6l9cg" event={"ID":"173f3ec0-4825-426f-8a8f-fa693b7068d2","Type":"ContainerStarted","Data":"bf93d0068890565339dd04dd75e629aa6ada6128315702a0b8dc8d0cf3f0c442"} Mar 09 03:01:37 crc kubenswrapper[4901]: I0309 03:01:37.286355 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nghlv" event={"ID":"ea546a17-3742-4171-a2ff-1df8b5dce890","Type":"ContainerStarted","Data":"e9e5853c2287347eb35126cc1cce13e81874443f22b793257e9e1c830e1fb811"} Mar 09 03:01:37 crc kubenswrapper[4901]: I0309 
03:01:37.309499 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jl2xh"] Mar 09 03:01:37 crc kubenswrapper[4901]: I0309 03:01:37.362519 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3c33-account-create-update-5dpsl"] Mar 09 03:01:37 crc kubenswrapper[4901]: I0309 03:01:37.373500 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-29jrh"] Mar 09 03:01:37 crc kubenswrapper[4901]: I0309 03:01:37.556403 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sxdq5"] Mar 09 03:01:37 crc kubenswrapper[4901]: I0309 03:01:37.578135 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e2c2-account-create-update-kdpd5"] Mar 09 03:01:37 crc kubenswrapper[4901]: W0309 03:01:37.598092 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod617dd4a9_970a_4c70_b587_f74323c172da.slice/crio-eb8ad76b4ee1582d1fd7127a2c42295ba68e05aee70b3cec2746376d188e6fd8 WatchSource:0}: Error finding container eb8ad76b4ee1582d1fd7127a2c42295ba68e05aee70b3cec2746376d188e6fd8: Status 404 returned error can't find the container with id eb8ad76b4ee1582d1fd7127a2c42295ba68e05aee70b3cec2746376d188e6fd8 Mar 09 03:01:37 crc kubenswrapper[4901]: W0309 03:01:37.598453 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50a6d871_6acd_45a8_ae76_958e8fd0b9ec.slice/crio-0bfb8eaf1c667702466b6651a14fbd47fc018668036d49096667baf5eb1c05f8 WatchSource:0}: Error finding container 0bfb8eaf1c667702466b6651a14fbd47fc018668036d49096667baf5eb1c05f8: Status 404 returned error can't find the container with id 0bfb8eaf1c667702466b6651a14fbd47fc018668036d49096667baf5eb1c05f8 Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.297211 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="173f3ec0-4825-426f-8a8f-fa693b7068d2" containerID="c237a3dc031b13f8d0620e7b2419fa1f1ad071510e6313e26b966db949a54e0d" exitCode=0 Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.297477 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8712-account-create-update-6l9cg" event={"ID":"173f3ec0-4825-426f-8a8f-fa693b7068d2","Type":"ContainerDied","Data":"c237a3dc031b13f8d0620e7b2419fa1f1ad071510e6313e26b966db949a54e0d"} Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.299514 4901 generic.go:334] "Generic (PLEG): container finished" podID="142cab77-aec7-45a4-9c64-45c3209e2a9d" containerID="ca232f3af6edff5f71fbe5a88099baa97cdd016953afd10711e2cf0859f5d72e" exitCode=0 Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.299562 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-29jrh" event={"ID":"142cab77-aec7-45a4-9c64-45c3209e2a9d","Type":"ContainerDied","Data":"ca232f3af6edff5f71fbe5a88099baa97cdd016953afd10711e2cf0859f5d72e"} Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.299581 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-29jrh" event={"ID":"142cab77-aec7-45a4-9c64-45c3209e2a9d","Type":"ContainerStarted","Data":"f19c814a86eb3fa662d7163f2c38a888f17deae97ff50fd109674cff2118d44f"} Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.301878 4901 generic.go:334] "Generic (PLEG): container finished" podID="df5d1410-e474-4ae8-980b-46092cc080b0" containerID="64204cefd4cbe01f276c43b9ac47b50629c25ba8c97f8eb560808222b34c0134" exitCode=0 Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.301944 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c33-account-create-update-5dpsl" event={"ID":"df5d1410-e474-4ae8-980b-46092cc080b0","Type":"ContainerDied","Data":"64204cefd4cbe01f276c43b9ac47b50629c25ba8c97f8eb560808222b34c0134"} Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.302000 4901 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c33-account-create-update-5dpsl" event={"ID":"df5d1410-e474-4ae8-980b-46092cc080b0","Type":"ContainerStarted","Data":"7cb78eae3f7d05750a2c5d86d4f8ab9e97c2b2de6e977efc514aaf62b5bc9e43"} Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.302896 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jl2xh" event={"ID":"0b6c57bb-65c0-4563-a902-94e55b6f0713","Type":"ContainerStarted","Data":"86d9474a548e2bbef4b64e3d2fb5fa4e0888be9f71d73d09bf91a021fcd96238"} Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.304205 4901 generic.go:334] "Generic (PLEG): container finished" podID="50a6d871-6acd-45a8-ae76-958e8fd0b9ec" containerID="6a0b0aaad45a66c9809f0281931099b6d19002aa141c6a0eada8f3fbaa72fd76" exitCode=0 Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.304255 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e2c2-account-create-update-kdpd5" event={"ID":"50a6d871-6acd-45a8-ae76-958e8fd0b9ec","Type":"ContainerDied","Data":"6a0b0aaad45a66c9809f0281931099b6d19002aa141c6a0eada8f3fbaa72fd76"} Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.304273 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e2c2-account-create-update-kdpd5" event={"ID":"50a6d871-6acd-45a8-ae76-958e8fd0b9ec","Type":"ContainerStarted","Data":"0bfb8eaf1c667702466b6651a14fbd47fc018668036d49096667baf5eb1c05f8"} Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.305631 4901 generic.go:334] "Generic (PLEG): container finished" podID="ea546a17-3742-4171-a2ff-1df8b5dce890" containerID="a252bc804dc69a7818b1fc5403d3bbbeeb8a07502d697a31a00f9fffeffc0dc3" exitCode=0 Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.305693 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nghlv" 
event={"ID":"ea546a17-3742-4171-a2ff-1df8b5dce890","Type":"ContainerDied","Data":"a252bc804dc69a7818b1fc5403d3bbbeeb8a07502d697a31a00f9fffeffc0dc3"} Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.308782 4901 generic.go:334] "Generic (PLEG): container finished" podID="617dd4a9-970a-4c70-b587-f74323c172da" containerID="ea76e472f757b16bd652388c5e4764ba583b8796f84a9249bfa120a81030e5e8" exitCode=0 Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.308842 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sxdq5" event={"ID":"617dd4a9-970a-4c70-b587-f74323c172da","Type":"ContainerDied","Data":"ea76e472f757b16bd652388c5e4764ba583b8796f84a9249bfa120a81030e5e8"} Mar 09 03:01:38 crc kubenswrapper[4901]: I0309 03:01:38.308876 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sxdq5" event={"ID":"617dd4a9-970a-4c70-b587-f74323c172da","Type":"ContainerStarted","Data":"eb8ad76b4ee1582d1fd7127a2c42295ba68e05aee70b3cec2746376d188e6fd8"} Mar 09 03:01:41 crc kubenswrapper[4901]: I0309 03:01:41.335978 4901 generic.go:334] "Generic (PLEG): container finished" podID="61302320-9299-4dcc-abeb-05c28dd977c1" containerID="6bf103e4ec5e9ca0d5bc452b5db7fc5f1271184e38999c1f0e1d1b833052187b" exitCode=0 Mar 09 03:01:41 crc kubenswrapper[4901]: I0309 03:01:41.336070 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kb7jk" event={"ID":"61302320-9299-4dcc-abeb-05c28dd977c1","Type":"ContainerDied","Data":"6bf103e4ec5e9ca0d5bc452b5db7fc5f1271184e38999c1f0e1d1b833052187b"} Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.175942 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sxdq5" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.264096 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nghlv" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.276009 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8712-account-create-update-6l9cg" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.332943 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-29jrh" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.343376 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/617dd4a9-970a-4c70-b587-f74323c172da-operator-scripts\") pod \"617dd4a9-970a-4c70-b587-f74323c172da\" (UID: \"617dd4a9-970a-4c70-b587-f74323c172da\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.343633 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2grx\" (UniqueName: \"kubernetes.io/projected/617dd4a9-970a-4c70-b587-f74323c172da-kube-api-access-g2grx\") pod \"617dd4a9-970a-4c70-b587-f74323c172da\" (UID: \"617dd4a9-970a-4c70-b587-f74323c172da\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.343990 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617dd4a9-970a-4c70-b587-f74323c172da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "617dd4a9-970a-4c70-b587-f74323c172da" (UID: "617dd4a9-970a-4c70-b587-f74323c172da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.344055 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3c33-account-create-update-5dpsl" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.351325 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617dd4a9-970a-4c70-b587-f74323c172da-kube-api-access-g2grx" (OuterVolumeSpecName: "kube-api-access-g2grx") pod "617dd4a9-970a-4c70-b587-f74323c172da" (UID: "617dd4a9-970a-4c70-b587-f74323c172da"). InnerVolumeSpecName "kube-api-access-g2grx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.366656 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8712-account-create-update-6l9cg" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.366904 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e2c2-account-create-update-kdpd5" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.366987 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8712-account-create-update-6l9cg" event={"ID":"173f3ec0-4825-426f-8a8f-fa693b7068d2","Type":"ContainerDied","Data":"bf93d0068890565339dd04dd75e629aa6ada6128315702a0b8dc8d0cf3f0c442"} Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.367015 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf93d0068890565339dd04dd75e629aa6ada6128315702a0b8dc8d0cf3f0c442" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.370026 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-29jrh" event={"ID":"142cab77-aec7-45a4-9c64-45c3209e2a9d","Type":"ContainerDied","Data":"f19c814a86eb3fa662d7163f2c38a888f17deae97ff50fd109674cff2118d44f"} Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.370050 4901 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f19c814a86eb3fa662d7163f2c38a888f17deae97ff50fd109674cff2118d44f" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.370058 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-29jrh" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.371559 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c33-account-create-update-5dpsl" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.371597 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c33-account-create-update-5dpsl" event={"ID":"df5d1410-e474-4ae8-980b-46092cc080b0","Type":"ContainerDied","Data":"7cb78eae3f7d05750a2c5d86d4f8ab9e97c2b2de6e977efc514aaf62b5bc9e43"} Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.371636 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cb78eae3f7d05750a2c5d86d4f8ab9e97c2b2de6e977efc514aaf62b5bc9e43" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.373646 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e2c2-account-create-update-kdpd5" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.373650 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e2c2-account-create-update-kdpd5" event={"ID":"50a6d871-6acd-45a8-ae76-958e8fd0b9ec","Type":"ContainerDied","Data":"0bfb8eaf1c667702466b6651a14fbd47fc018668036d49096667baf5eb1c05f8"} Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.373801 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bfb8eaf1c667702466b6651a14fbd47fc018668036d49096667baf5eb1c05f8" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.375494 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nghlv" event={"ID":"ea546a17-3742-4171-a2ff-1df8b5dce890","Type":"ContainerDied","Data":"e9e5853c2287347eb35126cc1cce13e81874443f22b793257e9e1c830e1fb811"} Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.375776 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e5853c2287347eb35126cc1cce13e81874443f22b793257e9e1c830e1fb811" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.375516 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nghlv" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.376795 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-sxdq5" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.376840 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sxdq5" event={"ID":"617dd4a9-970a-4c70-b587-f74323c172da","Type":"ContainerDied","Data":"eb8ad76b4ee1582d1fd7127a2c42295ba68e05aee70b3cec2746376d188e6fd8"} Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.376881 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb8ad76b4ee1582d1fd7127a2c42295ba68e05aee70b3cec2746376d188e6fd8" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.445020 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-795jw\" (UniqueName: \"kubernetes.io/projected/df5d1410-e474-4ae8-980b-46092cc080b0-kube-api-access-795jw\") pod \"df5d1410-e474-4ae8-980b-46092cc080b0\" (UID: \"df5d1410-e474-4ae8-980b-46092cc080b0\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.445211 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prvdd\" (UniqueName: \"kubernetes.io/projected/142cab77-aec7-45a4-9c64-45c3209e2a9d-kube-api-access-prvdd\") pod \"142cab77-aec7-45a4-9c64-45c3209e2a9d\" (UID: \"142cab77-aec7-45a4-9c64-45c3209e2a9d\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.445264 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tz7d\" (UniqueName: \"kubernetes.io/projected/ea546a17-3742-4171-a2ff-1df8b5dce890-kube-api-access-6tz7d\") pod \"ea546a17-3742-4171-a2ff-1df8b5dce890\" (UID: \"ea546a17-3742-4171-a2ff-1df8b5dce890\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.445303 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea546a17-3742-4171-a2ff-1df8b5dce890-operator-scripts\") pod 
\"ea546a17-3742-4171-a2ff-1df8b5dce890\" (UID: \"ea546a17-3742-4171-a2ff-1df8b5dce890\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.445340 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5d1410-e474-4ae8-980b-46092cc080b0-operator-scripts\") pod \"df5d1410-e474-4ae8-980b-46092cc080b0\" (UID: \"df5d1410-e474-4ae8-980b-46092cc080b0\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.445382 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/173f3ec0-4825-426f-8a8f-fa693b7068d2-operator-scripts\") pod \"173f3ec0-4825-426f-8a8f-fa693b7068d2\" (UID: \"173f3ec0-4825-426f-8a8f-fa693b7068d2\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.445408 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6c4k\" (UniqueName: \"kubernetes.io/projected/173f3ec0-4825-426f-8a8f-fa693b7068d2-kube-api-access-x6c4k\") pod \"173f3ec0-4825-426f-8a8f-fa693b7068d2\" (UID: \"173f3ec0-4825-426f-8a8f-fa693b7068d2\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.445466 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/142cab77-aec7-45a4-9c64-45c3209e2a9d-operator-scripts\") pod \"142cab77-aec7-45a4-9c64-45c3209e2a9d\" (UID: \"142cab77-aec7-45a4-9c64-45c3209e2a9d\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.446385 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2grx\" (UniqueName: \"kubernetes.io/projected/617dd4a9-970a-4c70-b587-f74323c172da-kube-api-access-g2grx\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.446485 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/617dd4a9-970a-4c70-b587-f74323c172da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.446531 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/142cab77-aec7-45a4-9c64-45c3209e2a9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "142cab77-aec7-45a4-9c64-45c3209e2a9d" (UID: "142cab77-aec7-45a4-9c64-45c3209e2a9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.446935 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea546a17-3742-4171-a2ff-1df8b5dce890-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea546a17-3742-4171-a2ff-1df8b5dce890" (UID: "ea546a17-3742-4171-a2ff-1df8b5dce890"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.447003 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df5d1410-e474-4ae8-980b-46092cc080b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df5d1410-e474-4ae8-980b-46092cc080b0" (UID: "df5d1410-e474-4ae8-980b-46092cc080b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.447168 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/173f3ec0-4825-426f-8a8f-fa693b7068d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "173f3ec0-4825-426f-8a8f-fa693b7068d2" (UID: "173f3ec0-4825-426f-8a8f-fa693b7068d2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.449554 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/142cab77-aec7-45a4-9c64-45c3209e2a9d-kube-api-access-prvdd" (OuterVolumeSpecName: "kube-api-access-prvdd") pod "142cab77-aec7-45a4-9c64-45c3209e2a9d" (UID: "142cab77-aec7-45a4-9c64-45c3209e2a9d"). InnerVolumeSpecName "kube-api-access-prvdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.450303 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5d1410-e474-4ae8-980b-46092cc080b0-kube-api-access-795jw" (OuterVolumeSpecName: "kube-api-access-795jw") pod "df5d1410-e474-4ae8-980b-46092cc080b0" (UID: "df5d1410-e474-4ae8-980b-46092cc080b0"). InnerVolumeSpecName "kube-api-access-795jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.450349 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173f3ec0-4825-426f-8a8f-fa693b7068d2-kube-api-access-x6c4k" (OuterVolumeSpecName: "kube-api-access-x6c4k") pod "173f3ec0-4825-426f-8a8f-fa693b7068d2" (UID: "173f3ec0-4825-426f-8a8f-fa693b7068d2"). InnerVolumeSpecName "kube-api-access-x6c4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.450891 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea546a17-3742-4171-a2ff-1df8b5dce890-kube-api-access-6tz7d" (OuterVolumeSpecName: "kube-api-access-6tz7d") pod "ea546a17-3742-4171-a2ff-1df8b5dce890" (UID: "ea546a17-3742-4171-a2ff-1df8b5dce890"). InnerVolumeSpecName "kube-api-access-6tz7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.548178 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-operator-scripts\") pod \"50a6d871-6acd-45a8-ae76-958e8fd0b9ec\" (UID: \"50a6d871-6acd-45a8-ae76-958e8fd0b9ec\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.548445 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdqsl\" (UniqueName: \"kubernetes.io/projected/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-kube-api-access-sdqsl\") pod \"50a6d871-6acd-45a8-ae76-958e8fd0b9ec\" (UID: \"50a6d871-6acd-45a8-ae76-958e8fd0b9ec\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.548871 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50a6d871-6acd-45a8-ae76-958e8fd0b9ec" (UID: "50a6d871-6acd-45a8-ae76-958e8fd0b9ec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.549009 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/142cab77-aec7-45a4-9c64-45c3209e2a9d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.549033 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-795jw\" (UniqueName: \"kubernetes.io/projected/df5d1410-e474-4ae8-980b-46092cc080b0-kube-api-access-795jw\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.549052 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.549069 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prvdd\" (UniqueName: \"kubernetes.io/projected/142cab77-aec7-45a4-9c64-45c3209e2a9d-kube-api-access-prvdd\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.549090 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tz7d\" (UniqueName: \"kubernetes.io/projected/ea546a17-3742-4171-a2ff-1df8b5dce890-kube-api-access-6tz7d\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.549105 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea546a17-3742-4171-a2ff-1df8b5dce890-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.549120 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5d1410-e474-4ae8-980b-46092cc080b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 
03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.549136 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/173f3ec0-4825-426f-8a8f-fa693b7068d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.549151 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6c4k\" (UniqueName: \"kubernetes.io/projected/173f3ec0-4825-426f-8a8f-fa693b7068d2-kube-api-access-x6c4k\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.553028 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-kube-api-access-sdqsl" (OuterVolumeSpecName: "kube-api-access-sdqsl") pod "50a6d871-6acd-45a8-ae76-958e8fd0b9ec" (UID: "50a6d871-6acd-45a8-ae76-958e8fd0b9ec"). InnerVolumeSpecName "kube-api-access-sdqsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.650395 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdqsl\" (UniqueName: \"kubernetes.io/projected/50a6d871-6acd-45a8-ae76-958e8fd0b9ec-kube-api-access-sdqsl\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.697196 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.751359 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-db-sync-config-data\") pod \"61302320-9299-4dcc-abeb-05c28dd977c1\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.751442 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-combined-ca-bundle\") pod \"61302320-9299-4dcc-abeb-05c28dd977c1\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.751492 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5qwr\" (UniqueName: \"kubernetes.io/projected/61302320-9299-4dcc-abeb-05c28dd977c1-kube-api-access-j5qwr\") pod \"61302320-9299-4dcc-abeb-05c28dd977c1\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.751531 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-config-data\") pod \"61302320-9299-4dcc-abeb-05c28dd977c1\" (UID: \"61302320-9299-4dcc-abeb-05c28dd977c1\") " Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.754628 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "61302320-9299-4dcc-abeb-05c28dd977c1" (UID: "61302320-9299-4dcc-abeb-05c28dd977c1"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.755072 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61302320-9299-4dcc-abeb-05c28dd977c1-kube-api-access-j5qwr" (OuterVolumeSpecName: "kube-api-access-j5qwr") pod "61302320-9299-4dcc-abeb-05c28dd977c1" (UID: "61302320-9299-4dcc-abeb-05c28dd977c1"). InnerVolumeSpecName "kube-api-access-j5qwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.778095 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61302320-9299-4dcc-abeb-05c28dd977c1" (UID: "61302320-9299-4dcc-abeb-05c28dd977c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.808676 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-config-data" (OuterVolumeSpecName: "config-data") pod "61302320-9299-4dcc-abeb-05c28dd977c1" (UID: "61302320-9299-4dcc-abeb-05c28dd977c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.852901 4901 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.852949 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.852962 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5qwr\" (UniqueName: \"kubernetes.io/projected/61302320-9299-4dcc-abeb-05c28dd977c1-kube-api-access-j5qwr\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:42 crc kubenswrapper[4901]: I0309 03:01:42.852977 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61302320-9299-4dcc-abeb-05c28dd977c1-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.421174 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kb7jk" event={"ID":"61302320-9299-4dcc-abeb-05c28dd977c1","Type":"ContainerDied","Data":"b2a5ae07e4928c1b599b4fb3fa452d070e25efb638f5b9edf3ef30e5b2d57649"} Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.422335 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2a5ae07e4928c1b599b4fb3fa452d070e25efb638f5b9edf3ef30e5b2d57649" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.422778 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-kb7jk" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.428722 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jl2xh" event={"ID":"0b6c57bb-65c0-4563-a902-94e55b6f0713","Type":"ContainerStarted","Data":"0bbad5de2a9568ae31ba845734a4840ec69ea84dc62b4ce66b1d97a4dc0ceae5"} Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.461629 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jl2xh" podStartSLOduration=2.614951873 podStartE2EDuration="7.461611574s" podCreationTimestamp="2026-03-09 03:01:36 +0000 UTC" firstStartedPulling="2026-03-09 03:01:37.321822249 +0000 UTC m=+1221.911485981" lastFinishedPulling="2026-03-09 03:01:42.16848194 +0000 UTC m=+1226.758145682" observedRunningTime="2026-03-09 03:01:43.442122362 +0000 UTC m=+1228.031786094" watchObservedRunningTime="2026-03-09 03:01:43.461611574 +0000 UTC m=+1228.051275306" Mar 09 03:01:43 crc kubenswrapper[4901]: E0309 03:01:43.577765 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61302320_9299_4dcc_abeb_05c28dd977c1.slice\": RecentStats: unable to find data in memory cache]" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.864865 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-fc5bl"] Mar 09 03:01:43 crc kubenswrapper[4901]: E0309 03:01:43.866894 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61302320-9299-4dcc-abeb-05c28dd977c1" containerName="glance-db-sync" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.866909 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="61302320-9299-4dcc-abeb-05c28dd977c1" containerName="glance-db-sync" Mar 09 03:01:43 crc kubenswrapper[4901]: E0309 03:01:43.866924 4901 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="df5d1410-e474-4ae8-980b-46092cc080b0" containerName="mariadb-account-create-update" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.866930 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5d1410-e474-4ae8-980b-46092cc080b0" containerName="mariadb-account-create-update" Mar 09 03:01:43 crc kubenswrapper[4901]: E0309 03:01:43.866941 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617dd4a9-970a-4c70-b587-f74323c172da" containerName="mariadb-database-create" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.866947 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="617dd4a9-970a-4c70-b587-f74323c172da" containerName="mariadb-database-create" Mar 09 03:01:43 crc kubenswrapper[4901]: E0309 03:01:43.866963 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="173f3ec0-4825-426f-8a8f-fa693b7068d2" containerName="mariadb-account-create-update" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.866968 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="173f3ec0-4825-426f-8a8f-fa693b7068d2" containerName="mariadb-account-create-update" Mar 09 03:01:43 crc kubenswrapper[4901]: E0309 03:01:43.866987 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142cab77-aec7-45a4-9c64-45c3209e2a9d" containerName="mariadb-database-create" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.866993 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="142cab77-aec7-45a4-9c64-45c3209e2a9d" containerName="mariadb-database-create" Mar 09 03:01:43 crc kubenswrapper[4901]: E0309 03:01:43.867001 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a6d871-6acd-45a8-ae76-958e8fd0b9ec" containerName="mariadb-account-create-update" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.867007 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a6d871-6acd-45a8-ae76-958e8fd0b9ec" containerName="mariadb-account-create-update" Mar 
09 03:01:43 crc kubenswrapper[4901]: E0309 03:01:43.867021 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea546a17-3742-4171-a2ff-1df8b5dce890" containerName="mariadb-database-create" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.867028 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea546a17-3742-4171-a2ff-1df8b5dce890" containerName="mariadb-database-create" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.867169 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea546a17-3742-4171-a2ff-1df8b5dce890" containerName="mariadb-database-create" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.867393 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a6d871-6acd-45a8-ae76-958e8fd0b9ec" containerName="mariadb-account-create-update" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.867407 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="142cab77-aec7-45a4-9c64-45c3209e2a9d" containerName="mariadb-database-create" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.867418 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5d1410-e474-4ae8-980b-46092cc080b0" containerName="mariadb-account-create-update" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.867444 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="173f3ec0-4825-426f-8a8f-fa693b7068d2" containerName="mariadb-account-create-update" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.867474 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="61302320-9299-4dcc-abeb-05c28dd977c1" containerName="glance-db-sync" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.867481 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="617dd4a9-970a-4c70-b587-f74323c172da" containerName="mariadb-database-create" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.869894 4901 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.902282 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-fc5bl"] Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.974962 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.975280 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-config\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.975301 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb6vm\" (UniqueName: \"kubernetes.io/projected/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-kube-api-access-lb6vm\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.975338 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.975371 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-svc\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:43 crc kubenswrapper[4901]: I0309 03:01:43.975406 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.076511 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.076583 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-svc\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.076636 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.076683 4901 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.076731 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-config\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.076752 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb6vm\" (UniqueName: \"kubernetes.io/projected/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-kube-api-access-lb6vm\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.077553 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.077689 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.077715 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-config\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.078162 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.078716 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-svc\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.105291 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb6vm\" (UniqueName: \"kubernetes.io/projected/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-kube-api-access-lb6vm\") pod \"dnsmasq-dns-7fd445f5bc-fc5bl\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.215915 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:44 crc kubenswrapper[4901]: I0309 03:01:44.694909 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-fc5bl"] Mar 09 03:01:44 crc kubenswrapper[4901]: W0309 03:01:44.705548 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc4d36d6_5ae4_4b51_ac8e_58ca1f094e87.slice/crio-f01f3355863aba1332eab1a7cbf54676ae167e6c22b83337b98f5e75481000da WatchSource:0}: Error finding container f01f3355863aba1332eab1a7cbf54676ae167e6c22b83337b98f5e75481000da: Status 404 returned error can't find the container with id f01f3355863aba1332eab1a7cbf54676ae167e6c22b83337b98f5e75481000da Mar 09 03:01:45 crc kubenswrapper[4901]: I0309 03:01:45.447397 4901 generic.go:334] "Generic (PLEG): container finished" podID="dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" containerID="eabc16e4cb88d98be0eb4b39b0c7908658d7e544e32938215701000a99d0197f" exitCode=0 Mar 09 03:01:45 crc kubenswrapper[4901]: I0309 03:01:45.447642 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" event={"ID":"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87","Type":"ContainerDied","Data":"eabc16e4cb88d98be0eb4b39b0c7908658d7e544e32938215701000a99d0197f"} Mar 09 03:01:45 crc kubenswrapper[4901]: I0309 03:01:45.447665 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" event={"ID":"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87","Type":"ContainerStarted","Data":"f01f3355863aba1332eab1a7cbf54676ae167e6c22b83337b98f5e75481000da"} Mar 09 03:01:46 crc kubenswrapper[4901]: I0309 03:01:46.460328 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" event={"ID":"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87","Type":"ContainerStarted","Data":"9597de26ae09249d81f1ead20d96f269323192bc4314f6965dcfa4428bb7991b"} Mar 09 03:01:46 crc 
kubenswrapper[4901]: I0309 03:01:46.460782 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:46 crc kubenswrapper[4901]: I0309 03:01:46.462273 4901 generic.go:334] "Generic (PLEG): container finished" podID="0b6c57bb-65c0-4563-a902-94e55b6f0713" containerID="0bbad5de2a9568ae31ba845734a4840ec69ea84dc62b4ce66b1d97a4dc0ceae5" exitCode=0 Mar 09 03:01:46 crc kubenswrapper[4901]: I0309 03:01:46.462322 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jl2xh" event={"ID":"0b6c57bb-65c0-4563-a902-94e55b6f0713","Type":"ContainerDied","Data":"0bbad5de2a9568ae31ba845734a4840ec69ea84dc62b4ce66b1d97a4dc0ceae5"} Mar 09 03:01:46 crc kubenswrapper[4901]: I0309 03:01:46.516583 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" podStartSLOduration=3.516555359 podStartE2EDuration="3.516555359s" podCreationTimestamp="2026-03-09 03:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:01:46.483601768 +0000 UTC m=+1231.073265510" watchObservedRunningTime="2026-03-09 03:01:46.516555359 +0000 UTC m=+1231.106219131" Mar 09 03:01:47 crc kubenswrapper[4901]: I0309 03:01:47.764676 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:47 crc kubenswrapper[4901]: I0309 03:01:47.941484 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-config-data\") pod \"0b6c57bb-65c0-4563-a902-94e55b6f0713\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " Mar 09 03:01:47 crc kubenswrapper[4901]: I0309 03:01:47.941560 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-combined-ca-bundle\") pod \"0b6c57bb-65c0-4563-a902-94e55b6f0713\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " Mar 09 03:01:47 crc kubenswrapper[4901]: I0309 03:01:47.941750 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bwxq\" (UniqueName: \"kubernetes.io/projected/0b6c57bb-65c0-4563-a902-94e55b6f0713-kube-api-access-2bwxq\") pod \"0b6c57bb-65c0-4563-a902-94e55b6f0713\" (UID: \"0b6c57bb-65c0-4563-a902-94e55b6f0713\") " Mar 09 03:01:47 crc kubenswrapper[4901]: I0309 03:01:47.954572 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b6c57bb-65c0-4563-a902-94e55b6f0713-kube-api-access-2bwxq" (OuterVolumeSpecName: "kube-api-access-2bwxq") pod "0b6c57bb-65c0-4563-a902-94e55b6f0713" (UID: "0b6c57bb-65c0-4563-a902-94e55b6f0713"). InnerVolumeSpecName "kube-api-access-2bwxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:47 crc kubenswrapper[4901]: I0309 03:01:47.971472 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b6c57bb-65c0-4563-a902-94e55b6f0713" (UID: "0b6c57bb-65c0-4563-a902-94e55b6f0713"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.012848 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-config-data" (OuterVolumeSpecName: "config-data") pod "0b6c57bb-65c0-4563-a902-94e55b6f0713" (UID: "0b6c57bb-65c0-4563-a902-94e55b6f0713"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.053265 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bwxq\" (UniqueName: \"kubernetes.io/projected/0b6c57bb-65c0-4563-a902-94e55b6f0713-kube-api-access-2bwxq\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.053295 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.053306 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6c57bb-65c0-4563-a902-94e55b6f0713-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.484692 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jl2xh" event={"ID":"0b6c57bb-65c0-4563-a902-94e55b6f0713","Type":"ContainerDied","Data":"86d9474a548e2bbef4b64e3d2fb5fa4e0888be9f71d73d09bf91a021fcd96238"} Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.485029 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86d9474a548e2bbef4b64e3d2fb5fa4e0888be9f71d73d09bf91a021fcd96238" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.484792 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jl2xh" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.798056 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wgm9z"] Mar 09 03:01:48 crc kubenswrapper[4901]: E0309 03:01:48.798464 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6c57bb-65c0-4563-a902-94e55b6f0713" containerName="keystone-db-sync" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.798480 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6c57bb-65c0-4563-a902-94e55b6f0713" containerName="keystone-db-sync" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.798691 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6c57bb-65c0-4563-a902-94e55b6f0713" containerName="keystone-db-sync" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.799304 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.802557 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.802775 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.802916 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.803094 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ghd6s" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.803159 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.816862 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-fc5bl"] Mar 09 03:01:48 crc 
kubenswrapper[4901]: I0309 03:01:48.817108 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" podUID="dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" containerName="dnsmasq-dns" containerID="cri-o://9597de26ae09249d81f1ead20d96f269323192bc4314f6965dcfa4428bb7991b" gracePeriod=10 Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.852895 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-2h27b"] Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.854487 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.862497 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wgm9z"] Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.865886 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-credential-keys\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.865913 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4m2m\" (UniqueName: \"kubernetes.io/projected/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-kube-api-access-q4m2m\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.865936 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-svc\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: 
\"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.865952 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-sb\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.865966 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-config-data\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.866009 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-swift-storage-0\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.866029 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-combined-ca-bundle\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.866050 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-scripts\") pod 
\"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.866078 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-nb\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.866117 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-fernet-keys\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.866135 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-config\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.866157 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjrd\" (UniqueName: \"kubernetes.io/projected/50b48e26-823e-4741-9985-85d776d2002f-kube-api-access-ktjrd\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.906678 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-2h27b"] Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.977895 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-nb\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.977980 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-fernet-keys\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.978008 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-config\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.978038 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjrd\" (UniqueName: \"kubernetes.io/projected/50b48e26-823e-4741-9985-85d776d2002f-kube-api-access-ktjrd\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.978063 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-credential-keys\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.978085 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4m2m\" 
(UniqueName: \"kubernetes.io/projected/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-kube-api-access-q4m2m\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.978108 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-svc\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.978125 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-sb\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.978144 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-config-data\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.978188 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-swift-storage-0\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.978206 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-combined-ca-bundle\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.978248 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-scripts\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.982503 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-nb\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.986292 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-config\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.992269 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-svc\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.992857 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-sb\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: 
\"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:48 crc kubenswrapper[4901]: I0309 03:01:48.995359 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-scripts\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.005562 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-credential-keys\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.006816 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-fernet-keys\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.007746 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-swift-storage-0\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.008894 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-combined-ca-bundle\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 
03:01:49.010765 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-config-data\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.021027 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.030253 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.034061 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.034303 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.043756 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4m2m\" (UniqueName: \"kubernetes.io/projected/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-kube-api-access-q4m2m\") pod \"keystone-bootstrap-wgm9z\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.079137 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtt9\" (UniqueName: \"kubernetes.io/projected/d0a22f21-beb0-44a1-943c-08547dc523a8-kube-api-access-tgtt9\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.079251 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-log-httpd\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.079283 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-run-httpd\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.079317 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-scripts\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.079336 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-config-data\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.079353 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.079367 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " 
pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.094439 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kzq5q"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.095510 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.105062 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjrd\" (UniqueName: \"kubernetes.io/projected/50b48e26-823e-4741-9985-85d776d2002f-kube-api-access-ktjrd\") pod \"dnsmasq-dns-5879b95d97-2h27b\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.105831 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.105889 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8mvrg" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.105994 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.123830 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kzq5q"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.139395 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.164846 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181071 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m7z7\" (UniqueName: \"kubernetes.io/projected/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-kube-api-access-5m7z7\") pod \"neutron-db-sync-kzq5q\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181120 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-log-httpd\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181146 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-run-httpd\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181190 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-scripts\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181211 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-config-data\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " 
pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181243 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181263 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181287 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-combined-ca-bundle\") pod \"neutron-db-sync-kzq5q\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181335 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtt9\" (UniqueName: \"kubernetes.io/projected/d0a22f21-beb0-44a1-943c-08547dc523a8-kube-api-access-tgtt9\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181409 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-config\") pod \"neutron-db-sync-kzq5q\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.181774 4901 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-db-sync-24d57"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.185927 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-run-httpd\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.186235 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-log-httpd\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.191866 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.205970 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-scripts\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.206634 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.208352 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.213648 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-f9knf" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.213895 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtt9\" (UniqueName: \"kubernetes.io/projected/d0a22f21-beb0-44a1-943c-08547dc523a8-kube-api-access-tgtt9\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.213912 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.214182 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.218958 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-config-data\") pod \"ceilometer-0\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.230776 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.264631 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-24d57"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.283047 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e040c407-4b37-4bee-b200-0d97b5767ef1-etc-machine-id\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.283108 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m7z7\" (UniqueName: \"kubernetes.io/projected/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-kube-api-access-5m7z7\") pod \"neutron-db-sync-kzq5q\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.283143 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-combined-ca-bundle\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.283163 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-config-data\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.283207 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-combined-ca-bundle\") pod \"neutron-db-sync-kzq5q\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.283239 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8fr5\" (UniqueName: \"kubernetes.io/projected/e040c407-4b37-4bee-b200-0d97b5767ef1-kube-api-access-r8fr5\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.283281 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-db-sync-config-data\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.283312 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-config\") pod \"neutron-db-sync-kzq5q\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.283331 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-scripts\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.293298 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.293629 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-config\") pod \"neutron-db-sync-kzq5q\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.303404 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-combined-ca-bundle\") pod \"neutron-db-sync-kzq5q\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.303608 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-2mh7b"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.304659 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.304891 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m7z7\" (UniqueName: \"kubernetes.io/projected/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-kube-api-access-5m7z7\") pod \"neutron-db-sync-kzq5q\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.318611 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-7z846" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.318789 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.325923 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.339998 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2mh7b"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.383639 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-2h27b"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.388494 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-db-sync-config-data\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.388603 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-scripts\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.388637 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-combined-ca-bundle\") pod \"barbican-db-sync-2mh7b\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.388679 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e040c407-4b37-4bee-b200-0d97b5767ef1-etc-machine-id\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.388730 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-db-sync-config-data\") pod \"barbican-db-sync-2mh7b\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.388815 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-config-data\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.388854 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-combined-ca-bundle\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.389000 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8fr5\" (UniqueName: \"kubernetes.io/projected/e040c407-4b37-4bee-b200-0d97b5767ef1-kube-api-access-r8fr5\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.389080 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq94g\" (UniqueName: \"kubernetes.io/projected/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-kube-api-access-cq94g\") pod \"barbican-db-sync-2mh7b\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.393701 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-db-sync-config-data\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.394008 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e040c407-4b37-4bee-b200-0d97b5767ef1-etc-machine-id\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.398356 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-config-data\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.398395 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-hvvzs"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.399443 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.401025 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-scripts\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.409776 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.410105 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-5mjl2" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.410130 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.410374 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-combined-ca-bundle\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.421682 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hvvzs"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.428863 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-pv679"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.430436 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.445505 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-pv679"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.461010 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8fr5\" (UniqueName: \"kubernetes.io/projected/e040c407-4b37-4bee-b200-0d97b5767ef1-kube-api-access-r8fr5\") pod \"cinder-db-sync-24d57\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490502 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq94g\" (UniqueName: \"kubernetes.io/projected/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-kube-api-access-cq94g\") pod \"barbican-db-sync-2mh7b\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490572 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-combined-ca-bundle\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490598 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbnqd\" (UniqueName: \"kubernetes.io/projected/cd374a21-cd74-447e-ab94-4e60e6f0d7be-kube-api-access-sbnqd\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490617 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-combined-ca-bundle\") pod \"barbican-db-sync-2mh7b\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490638 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd374a21-cd74-447e-ab94-4e60e6f0d7be-logs\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490662 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-db-sync-config-data\") pod \"barbican-db-sync-2mh7b\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490709 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490725 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-config\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490739 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnvp2\" (UniqueName: 
\"kubernetes.io/projected/4e886d37-056b-4000-876d-881906f1e3b3-kube-api-access-jnvp2\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490765 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-config-data\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490782 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490800 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-scripts\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490821 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-svc\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.490834 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.495547 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-combined-ca-bundle\") pod \"barbican-db-sync-2mh7b\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.503450 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-db-sync-config-data\") pod \"barbican-db-sync-2mh7b\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.511946 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq94g\" (UniqueName: \"kubernetes.io/projected/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-kube-api-access-cq94g\") pod \"barbican-db-sync-2mh7b\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.543284 4901 generic.go:334] "Generic (PLEG): container finished" podID="dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" containerID="9597de26ae09249d81f1ead20d96f269323192bc4314f6965dcfa4428bb7991b" exitCode=0 Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.543533 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" event={"ID":"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87","Type":"ContainerDied","Data":"9597de26ae09249d81f1ead20d96f269323192bc4314f6965dcfa4428bb7991b"} Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597077 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-config-data\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597126 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597154 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-scripts\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597174 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-svc\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597189 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597252 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-combined-ca-bundle\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597272 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbnqd\" (UniqueName: \"kubernetes.io/projected/cd374a21-cd74-447e-ab94-4e60e6f0d7be-kube-api-access-sbnqd\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597295 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd374a21-cd74-447e-ab94-4e60e6f0d7be-logs\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597349 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597365 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-config\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.597381 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnvp2\" (UniqueName: 
\"kubernetes.io/projected/4e886d37-056b-4000-876d-881906f1e3b3-kube-api-access-jnvp2\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.602589 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd374a21-cd74-447e-ab94-4e60e6f0d7be-logs\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.603595 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.603989 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-config\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.604544 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-svc\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.605485 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: 
\"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.605972 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.610129 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-scripts\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.610878 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-config-data\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.619122 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-combined-ca-bundle\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.635951 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbnqd\" (UniqueName: \"kubernetes.io/projected/cd374a21-cd74-447e-ab94-4e60e6f0d7be-kube-api-access-sbnqd\") pod \"placement-db-sync-hvvzs\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: 
I0309 03:01:49.644069 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnvp2\" (UniqueName: \"kubernetes.io/projected/4e886d37-056b-4000-876d-881906f1e3b3-kube-api-access-jnvp2\") pod \"dnsmasq-dns-5bf6456ddf-pv679\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.671752 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-24d57" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.730136 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.745480 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hvvzs" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.765844 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.927581 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.930357 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.936830 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tdwfn" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.937031 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.939687 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.939913 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 09 03:01:49 crc kubenswrapper[4901]: I0309 03:01:49.961927 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.018408 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wgm9z"] Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.063294 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.064728 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.066505 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.066968 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.073910 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.111868 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.111943 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.112015 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-scripts\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.112241 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-config-data\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.112326 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hckkg\" (UniqueName: \"kubernetes.io/projected/36150a58-4658-4234-95e1-fbd8a2150e3a-kube-api-access-hckkg\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.112351 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.112381 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.112441 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-logs\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.137143 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-2h27b"] Mar 09 
03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.190125 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.213929 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-logs\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.213989 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214104 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-scripts\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214133 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214197 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214272 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-config-data\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214305 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2457j\" (UniqueName: \"kubernetes.io/projected/ea1dc125-f1fd-422e-b18e-8b54956dd53e-kube-api-access-2457j\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214331 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214359 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hckkg\" (UniqueName: \"kubernetes.io/projected/36150a58-4658-4234-95e1-fbd8a2150e3a-kube-api-access-hckkg\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214382 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214404 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214422 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214452 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214473 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214501 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-logs\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214530 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.214948 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.215184 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.215245 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-logs\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.219275 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.220639 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-scripts\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.227722 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.227997 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-config-data\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.238815 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hckkg\" (UniqueName: \"kubernetes.io/projected/36150a58-4658-4234-95e1-fbd8a2150e3a-kube-api-access-hckkg\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.268606 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " 
pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.274094 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.315801 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.315900 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.315947 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2457j\" (UniqueName: \"kubernetes.io/projected/ea1dc125-f1fd-422e-b18e-8b54956dd53e-kube-api-access-2457j\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.315974 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.316013 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.316046 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.316070 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.316133 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-logs\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.316707 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-logs\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.317771 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.318148 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.330492 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.330940 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.331400 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.331823 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " 
pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.337212 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2457j\" (UniqueName: \"kubernetes.io/projected/ea1dc125-f1fd-422e-b18e-8b54956dd53e-kube-api-access-2457j\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.359633 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: W0309 03:01:50.402270 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a22f21_beb0_44a1_943c_08547dc523a8.slice/crio-27224698f4f4a0df2a0d33bf1fec0814ffee80676a2af26cd1937b2e34d8651d WatchSource:0}: Error finding container 27224698f4f4a0df2a0d33bf1fec0814ffee80676a2af26cd1937b2e34d8651d: Status 404 returned error can't find the container with id 27224698f4f4a0df2a0d33bf1fec0814ffee80676a2af26cd1937b2e34d8651d Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.402722 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.605851 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0a22f21-beb0-44a1-943c-08547dc523a8","Type":"ContainerStarted","Data":"27224698f4f4a0df2a0d33bf1fec0814ffee80676a2af26cd1937b2e34d8651d"} Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.616104 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5879b95d97-2h27b" event={"ID":"50b48e26-823e-4741-9985-85d776d2002f","Type":"ContainerStarted","Data":"891d7eccec507531b283f15f0830a2d5e3637a1eb255eb4682bccf94a4231a86"} Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.617709 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wgm9z" event={"ID":"1922b7e7-e63a-4d39-9785-11d2c64f5ec3","Type":"ContainerStarted","Data":"38a727bf7765a6d624a55482d90e987cb27e40cef06a0828b0f8ba5ef86eb04d"} Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.797412 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.946445 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-svc\") pod \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.946734 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-nb\") pod \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.946789 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-swift-storage-0\") pod \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.946897 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-sb\") pod \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.946929 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-config\") pod \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.946957 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb6vm\" 
(UniqueName: \"kubernetes.io/projected/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-kube-api-access-lb6vm\") pod \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\" (UID: \"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87\") " Mar 09 03:01:50 crc kubenswrapper[4901]: I0309 03:01:50.956837 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-kube-api-access-lb6vm" (OuterVolumeSpecName: "kube-api-access-lb6vm") pod "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" (UID: "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87"). InnerVolumeSpecName "kube-api-access-lb6vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.012612 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kzq5q"] Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.019410 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-config" (OuterVolumeSpecName: "config") pod "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" (UID: "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.034535 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" (UID: "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.035729 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" (UID: "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.044411 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2mh7b"] Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.050580 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.050608 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.050619 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb6vm\" (UniqueName: \"kubernetes.io/projected/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-kube-api-access-lb6vm\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.050628 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:51 crc kubenswrapper[4901]: W0309 03:01:51.051398 4901 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7acea3ec_d1eb_4971_b3b5_7c0b898cf07c.slice/crio-0875d8be1bdf57e78a21e99e38b8e031a6989e2cf5f767e3ceedb36940ee24dc WatchSource:0}: Error finding container 0875d8be1bdf57e78a21e99e38b8e031a6989e2cf5f767e3ceedb36940ee24dc: Status 404 returned error can't find the container with id 0875d8be1bdf57e78a21e99e38b8e031a6989e2cf5f767e3ceedb36940ee24dc Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.053995 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" (UID: "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.057012 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" (UID: "dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.160360 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.160664 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.174983 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hvvzs"] Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.187041 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-pv679"] Mar 09 03:01:51 crc kubenswrapper[4901]: W0309 03:01:51.188407 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd374a21_cd74_447e_ab94_4e60e6f0d7be.slice/crio-17866f29c8a8e93d4f89215a4f491b65610bfd63f5c2cd348280be8566eb6f51 WatchSource:0}: Error finding container 17866f29c8a8e93d4f89215a4f491b65610bfd63f5c2cd348280be8566eb6f51: Status 404 returned error can't find the container with id 17866f29c8a8e93d4f89215a4f491b65610bfd63f5c2cd348280be8566eb6f51 Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.288082 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.334831 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-24d57"] Mar 09 03:01:51 crc kubenswrapper[4901]: W0309 03:01:51.371528 4901 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode040c407_4b37_4bee_b200_0d97b5767ef1.slice/crio-c4d80066ab01a9e40394d0f2dc80715b884a6e89d2fd47341d66b0347e3860bc WatchSource:0}: Error finding container c4d80066ab01a9e40394d0f2dc80715b884a6e89d2fd47341d66b0347e3860bc: Status 404 returned error can't find the container with id c4d80066ab01a9e40394d0f2dc80715b884a6e89d2fd47341d66b0347e3860bc Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.415636 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.658361 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hvvzs" event={"ID":"cd374a21-cd74-447e-ab94-4e60e6f0d7be","Type":"ContainerStarted","Data":"17866f29c8a8e93d4f89215a4f491b65610bfd63f5c2cd348280be8566eb6f51"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.663582 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wgm9z" event={"ID":"1922b7e7-e63a-4d39-9785-11d2c64f5ec3","Type":"ContainerStarted","Data":"6994cdf17ee7689665487d835e40572e3f243ef6d5aebacb4cd6c596e686a753"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.666137 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-24d57" event={"ID":"e040c407-4b37-4bee-b200-0d97b5767ef1","Type":"ContainerStarted","Data":"c4d80066ab01a9e40394d0f2dc80715b884a6e89d2fd47341d66b0347e3860bc"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.669995 4901 generic.go:334] "Generic (PLEG): container finished" podID="50b48e26-823e-4741-9985-85d776d2002f" containerID="6df1adbf582fb2e72fa53899541f2d8cceaa3c381f5b8934d2878f963677f3a2" exitCode=0 Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.670042 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5879b95d97-2h27b" 
event={"ID":"50b48e26-823e-4741-9985-85d776d2002f","Type":"ContainerDied","Data":"6df1adbf582fb2e72fa53899541f2d8cceaa3c381f5b8934d2878f963677f3a2"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.670099 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.672625 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ea1dc125-f1fd-422e-b18e-8b54956dd53e","Type":"ContainerStarted","Data":"aa8a499ade2275a64d3c0f9ff6ea6bbf530c2d407e782153902b152957decbb3"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.684850 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wgm9z" podStartSLOduration=3.684833017 podStartE2EDuration="3.684833017s" podCreationTimestamp="2026-03-09 03:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:01:51.684441377 +0000 UTC m=+1236.274105109" watchObservedRunningTime="2026-03-09 03:01:51.684833017 +0000 UTC m=+1236.274496749" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.699604 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.699610 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd445f5bc-fc5bl" event={"ID":"dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87","Type":"ContainerDied","Data":"f01f3355863aba1332eab1a7cbf54676ae167e6c22b83337b98f5e75481000da"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.699922 4901 scope.go:117] "RemoveContainer" containerID="9597de26ae09249d81f1ead20d96f269323192bc4314f6965dcfa4428bb7991b" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.717053 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kzq5q" event={"ID":"ecfb1b2f-eb9b-47e0-905b-5785fc307df9","Type":"ContainerStarted","Data":"6fb0dbd3461b344fda5d6ea3fedec48538094ee2e81dcd7ca011d50e8e49ef2a"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.717097 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kzq5q" event={"ID":"ecfb1b2f-eb9b-47e0-905b-5785fc307df9","Type":"ContainerStarted","Data":"809a082bce41aab01df9eb27fbca185981a19fc13639453729eb3c558a4b7b81"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.734159 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2mh7b" event={"ID":"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c","Type":"ContainerStarted","Data":"0875d8be1bdf57e78a21e99e38b8e031a6989e2cf5f767e3ceedb36940ee24dc"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.756118 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36150a58-4658-4234-95e1-fbd8a2150e3a","Type":"ContainerStarted","Data":"caa6a2a84fad48dbbf9edb07dce2a1f55c372afd792e5e97df7fb0f1eba652b9"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.761080 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 
03:01:51.768600 4901 generic.go:334] "Generic (PLEG): container finished" podID="4e886d37-056b-4000-876d-881906f1e3b3" containerID="7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef" exitCode=0 Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.768644 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" event={"ID":"4e886d37-056b-4000-876d-881906f1e3b3","Type":"ContainerDied","Data":"7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.768673 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" event={"ID":"4e886d37-056b-4000-876d-881906f1e3b3","Type":"ContainerStarted","Data":"c9995225a00974cc442dc9a8d79ed0476c1ebc61295d633c1280d0d84035f10c"} Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.832400 4901 scope.go:117] "RemoveContainer" containerID="eabc16e4cb88d98be0eb4b39b0c7908658d7e544e32938215701000a99d0197f" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.865328 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.872642 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kzq5q" podStartSLOduration=3.872615395 podStartE2EDuration="3.872615395s" podCreationTimestamp="2026-03-09 03:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:01:51.766778745 +0000 UTC m=+1236.356442477" watchObservedRunningTime="2026-03-09 03:01:51.872615395 +0000 UTC m=+1236.462279127" Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.929689 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-fc5bl"] Mar 09 03:01:51 crc kubenswrapper[4901]: I0309 03:01:51.937841 4901 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd445f5bc-fc5bl"] Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.156007 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" path="/var/lib/kubelet/pods/dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87/volumes" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.471351 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.595110 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-config\") pod \"50b48e26-823e-4741-9985-85d776d2002f\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.595186 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-swift-storage-0\") pod \"50b48e26-823e-4741-9985-85d776d2002f\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.595320 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-svc\") pod \"50b48e26-823e-4741-9985-85d776d2002f\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.595358 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-sb\") pod \"50b48e26-823e-4741-9985-85d776d2002f\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.595375 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-nb\") pod \"50b48e26-823e-4741-9985-85d776d2002f\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.595434 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktjrd\" (UniqueName: \"kubernetes.io/projected/50b48e26-823e-4741-9985-85d776d2002f-kube-api-access-ktjrd\") pod \"50b48e26-823e-4741-9985-85d776d2002f\" (UID: \"50b48e26-823e-4741-9985-85d776d2002f\") " Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.599671 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b48e26-823e-4741-9985-85d776d2002f-kube-api-access-ktjrd" (OuterVolumeSpecName: "kube-api-access-ktjrd") pod "50b48e26-823e-4741-9985-85d776d2002f" (UID: "50b48e26-823e-4741-9985-85d776d2002f"). InnerVolumeSpecName "kube-api-access-ktjrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.618513 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50b48e26-823e-4741-9985-85d776d2002f" (UID: "50b48e26-823e-4741-9985-85d776d2002f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.622258 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "50b48e26-823e-4741-9985-85d776d2002f" (UID: "50b48e26-823e-4741-9985-85d776d2002f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.637187 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "50b48e26-823e-4741-9985-85d776d2002f" (UID: "50b48e26-823e-4741-9985-85d776d2002f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.644130 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50b48e26-823e-4741-9985-85d776d2002f" (UID: "50b48e26-823e-4741-9985-85d776d2002f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.665561 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-config" (OuterVolumeSpecName: "config") pod "50b48e26-823e-4741-9985-85d776d2002f" (UID: "50b48e26-823e-4741-9985-85d776d2002f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.697778 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.697809 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.697819 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.697828 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.697836 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50b48e26-823e-4741-9985-85d776d2002f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.697843 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktjrd\" (UniqueName: \"kubernetes.io/projected/50b48e26-823e-4741-9985-85d776d2002f-kube-api-access-ktjrd\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.786292 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36150a58-4658-4234-95e1-fbd8a2150e3a","Type":"ContainerStarted","Data":"0befcbdae9824655d573f04a64974e8227995442b9cd5555f2e28f83c9e8d63d"} Mar 09 03:01:52 crc 
kubenswrapper[4901]: I0309 03:01:52.800334 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" event={"ID":"4e886d37-056b-4000-876d-881906f1e3b3","Type":"ContainerStarted","Data":"cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef"} Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.801262 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.805075 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5879b95d97-2h27b" event={"ID":"50b48e26-823e-4741-9985-85d776d2002f","Type":"ContainerDied","Data":"891d7eccec507531b283f15f0830a2d5e3637a1eb255eb4682bccf94a4231a86"} Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.805107 4901 scope.go:117] "RemoveContainer" containerID="6df1adbf582fb2e72fa53899541f2d8cceaa3c381f5b8934d2878f963677f3a2" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.805174 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5879b95d97-2h27b" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.831565 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ea1dc125-f1fd-422e-b18e-8b54956dd53e","Type":"ContainerStarted","Data":"eca5e585e55c2eef8d4b52f8f46e780c2b0716d6ad22f3090ca5e05353b7663d"} Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.831971 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" podStartSLOduration=3.831955005 podStartE2EDuration="3.831955005s" podCreationTimestamp="2026-03-09 03:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:01:52.826673782 +0000 UTC m=+1237.416337514" watchObservedRunningTime="2026-03-09 03:01:52.831955005 +0000 UTC m=+1237.421618737" Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.880270 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-2h27b"] Mar 09 03:01:52 crc kubenswrapper[4901]: I0309 03:01:52.889635 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5879b95d97-2h27b"] Mar 09 03:01:53 crc kubenswrapper[4901]: I0309 03:01:53.858314 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36150a58-4658-4234-95e1-fbd8a2150e3a","Type":"ContainerStarted","Data":"d9ce7705317f5d77180a38dd8974551c075744ed35ba9bed92bb35fdc6a17f7a"} Mar 09 03:01:53 crc kubenswrapper[4901]: I0309 03:01:53.858320 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="36150a58-4658-4234-95e1-fbd8a2150e3a" containerName="glance-log" containerID="cri-o://0befcbdae9824655d573f04a64974e8227995442b9cd5555f2e28f83c9e8d63d" gracePeriod=30 Mar 09 03:01:53 crc kubenswrapper[4901]: 
I0309 03:01:53.858387 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="36150a58-4658-4234-95e1-fbd8a2150e3a" containerName="glance-httpd" containerID="cri-o://d9ce7705317f5d77180a38dd8974551c075744ed35ba9bed92bb35fdc6a17f7a" gracePeriod=30 Mar 09 03:01:53 crc kubenswrapper[4901]: I0309 03:01:53.872697 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" containerName="glance-log" containerID="cri-o://eca5e585e55c2eef8d4b52f8f46e780c2b0716d6ad22f3090ca5e05353b7663d" gracePeriod=30 Mar 09 03:01:53 crc kubenswrapper[4901]: I0309 03:01:53.872810 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" containerName="glance-httpd" containerID="cri-o://4b7faea4b66cc0d858bafaff43bcdba13fe992511969c7171160a58bf185b191" gracePeriod=30 Mar 09 03:01:53 crc kubenswrapper[4901]: I0309 03:01:53.873022 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ea1dc125-f1fd-422e-b18e-8b54956dd53e","Type":"ContainerStarted","Data":"4b7faea4b66cc0d858bafaff43bcdba13fe992511969c7171160a58bf185b191"} Mar 09 03:01:53 crc kubenswrapper[4901]: I0309 03:01:53.887014 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.886994001 podStartE2EDuration="5.886994001s" podCreationTimestamp="2026-03-09 03:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:01:53.88220976 +0000 UTC m=+1238.471873492" watchObservedRunningTime="2026-03-09 03:01:53.886994001 +0000 UTC m=+1238.476657733" Mar 09 03:01:53 crc kubenswrapper[4901]: I0309 03:01:53.923690 4901 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.923672766 podStartE2EDuration="5.923672766s" podCreationTimestamp="2026-03-09 03:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:01:53.914469434 +0000 UTC m=+1238.504133176" watchObservedRunningTime="2026-03-09 03:01:53.923672766 +0000 UTC m=+1238.513336498" Mar 09 03:01:54 crc kubenswrapper[4901]: I0309 03:01:54.131443 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b48e26-823e-4741-9985-85d776d2002f" path="/var/lib/kubelet/pods/50b48e26-823e-4741-9985-85d776d2002f/volumes" Mar 09 03:01:54 crc kubenswrapper[4901]: I0309 03:01:54.916062 4901 generic.go:334] "Generic (PLEG): container finished" podID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" containerID="4b7faea4b66cc0d858bafaff43bcdba13fe992511969c7171160a58bf185b191" exitCode=0 Mar 09 03:01:54 crc kubenswrapper[4901]: I0309 03:01:54.916093 4901 generic.go:334] "Generic (PLEG): container finished" podID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" containerID="eca5e585e55c2eef8d4b52f8f46e780c2b0716d6ad22f3090ca5e05353b7663d" exitCode=143 Mar 09 03:01:54 crc kubenswrapper[4901]: I0309 03:01:54.916130 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ea1dc125-f1fd-422e-b18e-8b54956dd53e","Type":"ContainerDied","Data":"4b7faea4b66cc0d858bafaff43bcdba13fe992511969c7171160a58bf185b191"} Mar 09 03:01:54 crc kubenswrapper[4901]: I0309 03:01:54.916155 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ea1dc125-f1fd-422e-b18e-8b54956dd53e","Type":"ContainerDied","Data":"eca5e585e55c2eef8d4b52f8f46e780c2b0716d6ad22f3090ca5e05353b7663d"} Mar 09 03:01:54 crc kubenswrapper[4901]: I0309 03:01:54.921875 4901 generic.go:334] "Generic (PLEG): 
container finished" podID="36150a58-4658-4234-95e1-fbd8a2150e3a" containerID="d9ce7705317f5d77180a38dd8974551c075744ed35ba9bed92bb35fdc6a17f7a" exitCode=0 Mar 09 03:01:54 crc kubenswrapper[4901]: I0309 03:01:54.921901 4901 generic.go:334] "Generic (PLEG): container finished" podID="36150a58-4658-4234-95e1-fbd8a2150e3a" containerID="0befcbdae9824655d573f04a64974e8227995442b9cd5555f2e28f83c9e8d63d" exitCode=143 Mar 09 03:01:54 crc kubenswrapper[4901]: I0309 03:01:54.922300 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36150a58-4658-4234-95e1-fbd8a2150e3a","Type":"ContainerDied","Data":"d9ce7705317f5d77180a38dd8974551c075744ed35ba9bed92bb35fdc6a17f7a"} Mar 09 03:01:54 crc kubenswrapper[4901]: I0309 03:01:54.922359 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36150a58-4658-4234-95e1-fbd8a2150e3a","Type":"ContainerDied","Data":"0befcbdae9824655d573f04a64974e8227995442b9cd5555f2e28f83c9e8d63d"} Mar 09 03:01:55 crc kubenswrapper[4901]: I0309 03:01:55.932344 4901 generic.go:334] "Generic (PLEG): container finished" podID="1922b7e7-e63a-4d39-9785-11d2c64f5ec3" containerID="6994cdf17ee7689665487d835e40572e3f243ef6d5aebacb4cd6c596e686a753" exitCode=0 Mar 09 03:01:55 crc kubenswrapper[4901]: I0309 03:01:55.932527 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wgm9z" event={"ID":"1922b7e7-e63a-4d39-9785-11d2c64f5ec3","Type":"ContainerDied","Data":"6994cdf17ee7689665487d835e40572e3f243ef6d5aebacb4cd6c596e686a753"} Mar 09 03:01:58 crc kubenswrapper[4901]: I0309 03:01:58.979261 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wgm9z" event={"ID":"1922b7e7-e63a-4d39-9785-11d2c64f5ec3","Type":"ContainerDied","Data":"38a727bf7765a6d624a55482d90e987cb27e40cef06a0828b0f8ba5ef86eb04d"} Mar 09 03:01:58 crc kubenswrapper[4901]: I0309 03:01:58.979834 4901 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38a727bf7765a6d624a55482d90e987cb27e40cef06a0828b0f8ba5ef86eb04d" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.038503 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.221622 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-credential-keys\") pod \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.221695 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-fernet-keys\") pod \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.221830 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-scripts\") pod \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.221883 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4m2m\" (UniqueName: \"kubernetes.io/projected/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-kube-api-access-q4m2m\") pod \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.221901 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-combined-ca-bundle\") pod \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.221939 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-config-data\") pod \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\" (UID: \"1922b7e7-e63a-4d39-9785-11d2c64f5ec3\") " Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.235397 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1922b7e7-e63a-4d39-9785-11d2c64f5ec3" (UID: "1922b7e7-e63a-4d39-9785-11d2c64f5ec3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.235433 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-scripts" (OuterVolumeSpecName: "scripts") pod "1922b7e7-e63a-4d39-9785-11d2c64f5ec3" (UID: "1922b7e7-e63a-4d39-9785-11d2c64f5ec3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.235453 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-kube-api-access-q4m2m" (OuterVolumeSpecName: "kube-api-access-q4m2m") pod "1922b7e7-e63a-4d39-9785-11d2c64f5ec3" (UID: "1922b7e7-e63a-4d39-9785-11d2c64f5ec3"). InnerVolumeSpecName "kube-api-access-q4m2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.248845 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-config-data" (OuterVolumeSpecName: "config-data") pod "1922b7e7-e63a-4d39-9785-11d2c64f5ec3" (UID: "1922b7e7-e63a-4d39-9785-11d2c64f5ec3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.249594 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1922b7e7-e63a-4d39-9785-11d2c64f5ec3" (UID: "1922b7e7-e63a-4d39-9785-11d2c64f5ec3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.253632 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1922b7e7-e63a-4d39-9785-11d2c64f5ec3" (UID: "1922b7e7-e63a-4d39-9785-11d2c64f5ec3"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.324069 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.324099 4901 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.324111 4901 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.324119 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.324130 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4m2m\" (UniqueName: \"kubernetes.io/projected/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-kube-api-access-q4m2m\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.324139 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1922b7e7-e63a-4d39-9785-11d2c64f5ec3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.768468 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.833110 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-2vwc7"] Mar 09 
03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.833395 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" podUID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" containerName="dnsmasq-dns" containerID="cri-o://5b5ddb2d4483db37db677d989441cc79de881a4e50a87a1af5715665794290a6" gracePeriod=10 Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.993089 4901 generic.go:334] "Generic (PLEG): container finished" podID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" containerID="5b5ddb2d4483db37db677d989441cc79de881a4e50a87a1af5715665794290a6" exitCode=0 Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.993169 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wgm9z" Mar 09 03:01:59 crc kubenswrapper[4901]: I0309 03:01:59.993334 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" event={"ID":"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe","Type":"ContainerDied","Data":"5b5ddb2d4483db37db677d989441cc79de881a4e50a87a1af5715665794290a6"} Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.156026 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550422-wg2vp"] Mar 09 03:02:00 crc kubenswrapper[4901]: E0309 03:02:00.156367 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1922b7e7-e63a-4d39-9785-11d2c64f5ec3" containerName="keystone-bootstrap" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.156377 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1922b7e7-e63a-4d39-9785-11d2c64f5ec3" containerName="keystone-bootstrap" Mar 09 03:02:00 crc kubenswrapper[4901]: E0309 03:02:00.156392 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" containerName="dnsmasq-dns" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.156398 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" containerName="dnsmasq-dns" Mar 09 03:02:00 crc kubenswrapper[4901]: E0309 03:02:00.156421 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" containerName="init" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.156428 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" containerName="init" Mar 09 03:02:00 crc kubenswrapper[4901]: E0309 03:02:00.156439 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b48e26-823e-4741-9985-85d776d2002f" containerName="init" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.156444 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b48e26-823e-4741-9985-85d776d2002f" containerName="init" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.156625 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b48e26-823e-4741-9985-85d776d2002f" containerName="init" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.156632 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1922b7e7-e63a-4d39-9785-11d2c64f5ec3" containerName="keystone-bootstrap" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.156653 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4d36d6-5ae4-4b51-ac8e-58ca1f094e87" containerName="dnsmasq-dns" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.157169 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550422-wg2vp" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.164636 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550422-wg2vp"] Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.165584 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.165602 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.167126 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.173111 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wgm9z"] Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.179555 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wgm9z"] Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.240977 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rkxc\" (UniqueName: \"kubernetes.io/projected/f285768d-05c2-41f7-a3db-7c76d4df9fb8-kube-api-access-4rkxc\") pod \"auto-csr-approver-29550422-wg2vp\" (UID: \"f285768d-05c2-41f7-a3db-7c76d4df9fb8\") " pod="openshift-infra/auto-csr-approver-29550422-wg2vp" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.245496 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qzdch"] Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.246484 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.248199 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.248915 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.249311 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ghd6s" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.249430 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.249661 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.253793 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qzdch"] Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.342388 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-credential-keys\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.342695 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rkxc\" (UniqueName: \"kubernetes.io/projected/f285768d-05c2-41f7-a3db-7c76d4df9fb8-kube-api-access-4rkxc\") pod \"auto-csr-approver-29550422-wg2vp\" (UID: \"f285768d-05c2-41f7-a3db-7c76d4df9fb8\") " pod="openshift-infra/auto-csr-approver-29550422-wg2vp" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.342799 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-combined-ca-bundle\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.342910 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-fernet-keys\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.343003 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp5pd\" (UniqueName: \"kubernetes.io/projected/5140d205-de33-4e39-95fb-451471d3e7e9-kube-api-access-qp5pd\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.343041 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-config-data\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.343087 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-scripts\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.364779 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4rkxc\" (UniqueName: \"kubernetes.io/projected/f285768d-05c2-41f7-a3db-7c76d4df9fb8-kube-api-access-4rkxc\") pod \"auto-csr-approver-29550422-wg2vp\" (UID: \"f285768d-05c2-41f7-a3db-7c76d4df9fb8\") " pod="openshift-infra/auto-csr-approver-29550422-wg2vp" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.444550 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-scripts\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.444651 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-credential-keys\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.444743 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-combined-ca-bundle\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.444781 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-fernet-keys\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.444809 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp5pd\" (UniqueName: 
\"kubernetes.io/projected/5140d205-de33-4e39-95fb-451471d3e7e9-kube-api-access-qp5pd\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.444836 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-config-data\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.448101 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-scripts\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.448655 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-credential-keys\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.449970 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-combined-ca-bundle\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.450110 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-fernet-keys\") pod \"keystone-bootstrap-qzdch\" (UID: 
\"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.451055 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-config-data\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.471412 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550422-wg2vp" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.472028 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp5pd\" (UniqueName: \"kubernetes.io/projected/5140d205-de33-4e39-95fb-451471d3e7e9-kube-api-access-qp5pd\") pod \"keystone-bootstrap-qzdch\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:00 crc kubenswrapper[4901]: I0309 03:02:00.560030 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:02 crc kubenswrapper[4901]: I0309 03:02:02.117082 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1922b7e7-e63a-4d39-9785-11d2c64f5ec3" path="/var/lib/kubelet/pods/1922b7e7-e63a-4d39-9785-11d2c64f5ec3/volumes" Mar 09 03:02:04 crc kubenswrapper[4901]: I0309 03:02:04.779941 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" podUID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.091494 4901 generic.go:334] "Generic (PLEG): container finished" podID="ecfb1b2f-eb9b-47e0-905b-5785fc307df9" containerID="6fb0dbd3461b344fda5d6ea3fedec48538094ee2e81dcd7ca011d50e8e49ef2a" exitCode=0 Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.092056 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kzq5q" event={"ID":"ecfb1b2f-eb9b-47e0-905b-5785fc307df9","Type":"ContainerDied","Data":"6fb0dbd3461b344fda5d6ea3fedec48538094ee2e81dcd7ca011d50e8e49ef2a"} Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.585558 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.731265 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-combined-ca-bundle\") pod \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.731336 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-httpd-run\") pod \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.731399 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-internal-tls-certs\") pod \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.731449 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-logs\") pod \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.731489 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.731574 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-scripts\") pod \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.731635 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2457j\" (UniqueName: \"kubernetes.io/projected/ea1dc125-f1fd-422e-b18e-8b54956dd53e-kube-api-access-2457j\") pod \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.731650 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-config-data\") pod \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\" (UID: \"ea1dc125-f1fd-422e-b18e-8b54956dd53e\") " Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.732731 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-logs" (OuterVolumeSpecName: "logs") pod "ea1dc125-f1fd-422e-b18e-8b54956dd53e" (UID: "ea1dc125-f1fd-422e-b18e-8b54956dd53e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.732752 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ea1dc125-f1fd-422e-b18e-8b54956dd53e" (UID: "ea1dc125-f1fd-422e-b18e-8b54956dd53e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.739095 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "ea1dc125-f1fd-422e-b18e-8b54956dd53e" (UID: "ea1dc125-f1fd-422e-b18e-8b54956dd53e"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.739122 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-scripts" (OuterVolumeSpecName: "scripts") pod "ea1dc125-f1fd-422e-b18e-8b54956dd53e" (UID: "ea1dc125-f1fd-422e-b18e-8b54956dd53e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.739207 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1dc125-f1fd-422e-b18e-8b54956dd53e-kube-api-access-2457j" (OuterVolumeSpecName: "kube-api-access-2457j") pod "ea1dc125-f1fd-422e-b18e-8b54956dd53e" (UID: "ea1dc125-f1fd-422e-b18e-8b54956dd53e"). InnerVolumeSpecName "kube-api-access-2457j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.776244 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ea1dc125-f1fd-422e-b18e-8b54956dd53e" (UID: "ea1dc125-f1fd-422e-b18e-8b54956dd53e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.779443 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-config-data" (OuterVolumeSpecName: "config-data") pod "ea1dc125-f1fd-422e-b18e-8b54956dd53e" (UID: "ea1dc125-f1fd-422e-b18e-8b54956dd53e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.779751 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" podUID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.787309 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea1dc125-f1fd-422e-b18e-8b54956dd53e" (UID: "ea1dc125-f1fd-422e-b18e-8b54956dd53e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.833095 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.833126 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.833136 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.833147 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1dc125-f1fd-422e-b18e-8b54956dd53e-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.833180 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.833190 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.833198 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1dc125-f1fd-422e-b18e-8b54956dd53e-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.833207 4901 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2457j\" (UniqueName: \"kubernetes.io/projected/ea1dc125-f1fd-422e-b18e-8b54956dd53e-kube-api-access-2457j\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.849258 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 09 03:02:09 crc kubenswrapper[4901]: I0309 03:02:09.934935 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.101055 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.101068 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ea1dc125-f1fd-422e-b18e-8b54956dd53e","Type":"ContainerDied","Data":"aa8a499ade2275a64d3c0f9ff6ea6bbf530c2d407e782153902b152957decbb3"} Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.101157 4901 scope.go:117] "RemoveContainer" containerID="4b7faea4b66cc0d858bafaff43bcdba13fe992511969c7171160a58bf185b191" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.138402 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.148712 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.177627 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:02:10 crc kubenswrapper[4901]: E0309 03:02:10.178050 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" 
containerName="glance-log" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.178066 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" containerName="glance-log" Mar 09 03:02:10 crc kubenswrapper[4901]: E0309 03:02:10.178094 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" containerName="glance-httpd" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.178110 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" containerName="glance-httpd" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.178308 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" containerName="glance-log" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.178330 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" containerName="glance-httpd" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.179399 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.187861 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.188438 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.193152 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.295857 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.341820 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.341856 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.341891 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.341915 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.341938 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.341955 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-logs\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.341984 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55q5b\" (UniqueName: \"kubernetes.io/projected/98c082b4-bd96-4b48-80ec-6c0dc672ec58-kube-api-access-55q5b\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.342027 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.442878 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"36150a58-4658-4234-95e1-fbd8a2150e3a\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.442925 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-scripts\") pod \"36150a58-4658-4234-95e1-fbd8a2150e3a\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " Mar 09 03:02:10 crc 
kubenswrapper[4901]: I0309 03:02:10.442979 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-httpd-run\") pod \"36150a58-4658-4234-95e1-fbd8a2150e3a\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443024 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-config-data\") pod \"36150a58-4658-4234-95e1-fbd8a2150e3a\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443099 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-public-tls-certs\") pod \"36150a58-4658-4234-95e1-fbd8a2150e3a\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443120 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hckkg\" (UniqueName: \"kubernetes.io/projected/36150a58-4658-4234-95e1-fbd8a2150e3a-kube-api-access-hckkg\") pod \"36150a58-4658-4234-95e1-fbd8a2150e3a\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443136 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-combined-ca-bundle\") pod \"36150a58-4658-4234-95e1-fbd8a2150e3a\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443171 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-logs\") pod 
\"36150a58-4658-4234-95e1-fbd8a2150e3a\" (UID: \"36150a58-4658-4234-95e1-fbd8a2150e3a\") " Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443431 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443453 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443481 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443507 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443527 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 
03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443541 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-logs\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443566 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55q5b\" (UniqueName: \"kubernetes.io/projected/98c082b4-bd96-4b48-80ec-6c0dc672ec58-kube-api-access-55q5b\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443613 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.443911 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "36150a58-4658-4234-95e1-fbd8a2150e3a" (UID: "36150a58-4658-4234-95e1-fbd8a2150e3a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.444310 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-logs" (OuterVolumeSpecName: "logs") pod "36150a58-4658-4234-95e1-fbd8a2150e3a" (UID: "36150a58-4658-4234-95e1-fbd8a2150e3a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.444490 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.445382 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-logs\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.448145 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.449119 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-scripts" (OuterVolumeSpecName: "scripts") pod "36150a58-4658-4234-95e1-fbd8a2150e3a" (UID: "36150a58-4658-4234-95e1-fbd8a2150e3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.449214 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36150a58-4658-4234-95e1-fbd8a2150e3a-kube-api-access-hckkg" (OuterVolumeSpecName: "kube-api-access-hckkg") pod "36150a58-4658-4234-95e1-fbd8a2150e3a" (UID: "36150a58-4658-4234-95e1-fbd8a2150e3a"). 
InnerVolumeSpecName "kube-api-access-hckkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.451051 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.451783 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.455819 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.456471 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "36150a58-4658-4234-95e1-fbd8a2150e3a" (UID: "36150a58-4658-4234-95e1-fbd8a2150e3a"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.462041 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.463103 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55q5b\" (UniqueName: \"kubernetes.io/projected/98c082b4-bd96-4b48-80ec-6c0dc672ec58-kube-api-access-55q5b\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.479732 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36150a58-4658-4234-95e1-fbd8a2150e3a" (UID: "36150a58-4658-4234-95e1-fbd8a2150e3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.496678 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.502921 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-config-data" (OuterVolumeSpecName: "config-data") pod "36150a58-4658-4234-95e1-fbd8a2150e3a" (UID: "36150a58-4658-4234-95e1-fbd8a2150e3a"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.506371 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.510554 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "36150a58-4658-4234-95e1-fbd8a2150e3a" (UID: "36150a58-4658-4234-95e1-fbd8a2150e3a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.544830 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.545068 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.545176 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.545299 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.545390 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-public-tls-certs\") on node \"crc\" DevicePath 
\"\"" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.545475 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hckkg\" (UniqueName: \"kubernetes.io/projected/36150a58-4658-4234-95e1-fbd8a2150e3a-kube-api-access-hckkg\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.545560 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36150a58-4658-4234-95e1-fbd8a2150e3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.545636 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36150a58-4658-4234-95e1-fbd8a2150e3a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.573374 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 09 03:02:10 crc kubenswrapper[4901]: I0309 03:02:10.647085 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.113569 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36150a58-4658-4234-95e1-fbd8a2150e3a","Type":"ContainerDied","Data":"caa6a2a84fad48dbbf9edb07dce2a1f55c372afd792e5e97df7fb0f1eba652b9"} Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.114738 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.159496 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.178986 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.193452 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:02:11 crc kubenswrapper[4901]: E0309 03:02:11.193863 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36150a58-4658-4234-95e1-fbd8a2150e3a" containerName="glance-log" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.193887 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="36150a58-4658-4234-95e1-fbd8a2150e3a" containerName="glance-log" Mar 09 03:02:11 crc kubenswrapper[4901]: E0309 03:02:11.193929 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36150a58-4658-4234-95e1-fbd8a2150e3a" containerName="glance-httpd" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.193938 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="36150a58-4658-4234-95e1-fbd8a2150e3a" containerName="glance-httpd" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.194135 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="36150a58-4658-4234-95e1-fbd8a2150e3a" containerName="glance-log" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.194156 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="36150a58-4658-4234-95e1-fbd8a2150e3a" containerName="glance-httpd" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.195193 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.197636 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.197761 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.203956 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.356747 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.356803 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gngb5\" (UniqueName: \"kubernetes.io/projected/2e7f8f6c-6ee2-4c69-a626-59821baff365-kube-api-access-gngb5\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.356830 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-logs\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.356966 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.357112 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.357528 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.357581 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.357609 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.460002 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.460082 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gngb5\" (UniqueName: \"kubernetes.io/projected/2e7f8f6c-6ee2-4c69-a626-59821baff365-kube-api-access-gngb5\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.460127 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-logs\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.460160 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.460216 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.460404 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-public-tls-certs\") 
pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.460444 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.460484 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.461052 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.461164 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.463307 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-logs\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " 
pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.465468 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.466018 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.474398 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.474840 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.486569 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gngb5\" (UniqueName: \"kubernetes.io/projected/2e7f8f6c-6ee2-4c69-a626-59821baff365-kube-api-access-gngb5\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: 
I0309 03:02:11.489824 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: I0309 03:02:11.536052 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:02:11 crc kubenswrapper[4901]: E0309 03:02:11.589980 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 09 03:02:11 crc kubenswrapper[4901]: E0309 03:02:11.590169 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cq94g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-2mh7b_openstack(7acea3ec-d1eb-4971-b3b5-7c0b898cf07c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 03:02:11 crc kubenswrapper[4901]: E0309 03:02:11.591335 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-2mh7b" 
podUID="7acea3ec-d1eb-4971-b3b5-7c0b898cf07c" Mar 09 03:02:12 crc kubenswrapper[4901]: I0309 03:02:12.116736 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36150a58-4658-4234-95e1-fbd8a2150e3a" path="/var/lib/kubelet/pods/36150a58-4658-4234-95e1-fbd8a2150e3a/volumes" Mar 09 03:02:12 crc kubenswrapper[4901]: I0309 03:02:12.119058 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1dc125-f1fd-422e-b18e-8b54956dd53e" path="/var/lib/kubelet/pods/ea1dc125-f1fd-422e-b18e-8b54956dd53e/volumes" Mar 09 03:02:12 crc kubenswrapper[4901]: E0309 03:02:12.123866 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-2mh7b" podUID="7acea3ec-d1eb-4971-b3b5-7c0b898cf07c" Mar 09 03:02:12 crc kubenswrapper[4901]: I0309 03:02:12.793144 4901 scope.go:117] "RemoveContainer" containerID="eca5e585e55c2eef8d4b52f8f46e780c2b0716d6ad22f3090ca5e05353b7663d" Mar 09 03:02:12 crc kubenswrapper[4901]: E0309 03:02:12.852270 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 09 03:02:12 crc kubenswrapper[4901]: E0309 03:02:12.852459 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8fr5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-24d57_openstack(e040c407-4b37-4bee-b200-0d97b5767ef1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 03:02:12 crc kubenswrapper[4901]: E0309 03:02:12.853940 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-24d57" podUID="e040c407-4b37-4bee-b200-0d97b5767ef1" Mar 09 03:02:12 crc kubenswrapper[4901]: I0309 03:02:12.943752 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:02:12 crc kubenswrapper[4901]: I0309 03:02:12.956320 4901 scope.go:117] "RemoveContainer" containerID="d9ce7705317f5d77180a38dd8974551c075744ed35ba9bed92bb35fdc6a17f7a" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.066762 4901 scope.go:117] "RemoveContainer" containerID="0befcbdae9824655d573f04a64974e8227995442b9cd5555f2e28f83c9e8d63d" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.086544 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-config\") pod \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.086643 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m7z7\" (UniqueName: \"kubernetes.io/projected/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-kube-api-access-5m7z7\") pod \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " Mar 09 
03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.086695 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-combined-ca-bundle\") pod \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\" (UID: \"ecfb1b2f-eb9b-47e0-905b-5785fc307df9\") " Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.092211 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-kube-api-access-5m7z7" (OuterVolumeSpecName: "kube-api-access-5m7z7") pod "ecfb1b2f-eb9b-47e0-905b-5785fc307df9" (UID: "ecfb1b2f-eb9b-47e0-905b-5785fc307df9"). InnerVolumeSpecName "kube-api-access-5m7z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.161381 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecfb1b2f-eb9b-47e0-905b-5785fc307df9" (UID: "ecfb1b2f-eb9b-47e0-905b-5785fc307df9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.164363 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-config" (OuterVolumeSpecName: "config") pod "ecfb1b2f-eb9b-47e0-905b-5785fc307df9" (UID: "ecfb1b2f-eb9b-47e0-905b-5785fc307df9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.190475 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.191215 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.191267 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m7z7\" (UniqueName: \"kubernetes.io/projected/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-kube-api-access-5m7z7\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.191277 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfb1b2f-eb9b-47e0-905b-5785fc307df9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.254736 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kzq5q" event={"ID":"ecfb1b2f-eb9b-47e0-905b-5785fc307df9","Type":"ContainerDied","Data":"809a082bce41aab01df9eb27fbca185981a19fc13639453729eb3c558a4b7b81"} Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.255036 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="809a082bce41aab01df9eb27fbca185981a19fc13639453729eb3c558a4b7b81" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.255110 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kzq5q" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.285133 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.285326 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdcf4fccc-2vwc7" event={"ID":"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe","Type":"ContainerDied","Data":"fd2fd4d0512fad1a2d30715fd2ba612e370d39334f001ff1a942426bdca7066c"} Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.285357 4901 scope.go:117] "RemoveContainer" containerID="5b5ddb2d4483db37db677d989441cc79de881a4e50a87a1af5715665794290a6" Mar 09 03:02:13 crc kubenswrapper[4901]: E0309 03:02:13.294550 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-24d57" podUID="e040c407-4b37-4bee-b200-0d97b5767ef1" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.294767 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzshw\" (UniqueName: \"kubernetes.io/projected/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-kube-api-access-nzshw\") pod \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.294936 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-config\") pod \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.294958 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-svc\") pod 
\"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.295008 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-sb\") pod \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.295029 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-nb\") pod \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.295080 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-swift-storage-0\") pod \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\" (UID: \"e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe\") " Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.305311 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-kube-api-access-nzshw" (OuterVolumeSpecName: "kube-api-access-nzshw") pod "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" (UID: "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe"). InnerVolumeSpecName "kube-api-access-nzshw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.345820 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qzdch"] Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.358202 4901 scope.go:117] "RemoveContainer" containerID="df9376a93fc75107929870fac305532c8f8535f22398bb8dc3f3654ba39d51f4" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.359745 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" (UID: "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.396749 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzshw\" (UniqueName: \"kubernetes.io/projected/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-kube-api-access-nzshw\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.396777 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.414659 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.415660 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" (UID: "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.422384 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" (UID: "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.422797 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-config" (OuterVolumeSpecName: "config") pod "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" (UID: "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.434698 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" (UID: "e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:13 crc kubenswrapper[4901]: W0309 03:02:13.450698 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98c082b4_bd96_4b48_80ec_6c0dc672ec58.slice/crio-20da0afed589aa15b9d0021134c60a488b9306e15f3779f5aba6f286522d4555 WatchSource:0}: Error finding container 20da0afed589aa15b9d0021134c60a488b9306e15f3779f5aba6f286522d4555: Status 404 returned error can't find the container with id 20da0afed589aa15b9d0021134c60a488b9306e15f3779f5aba6f286522d4555 Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.498299 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.498324 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.498336 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.498345 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.606096 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550422-wg2vp"] Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.633854 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-2vwc7"] Mar 09 
03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.650287 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bdcf4fccc-2vwc7"] Mar 09 03:02:13 crc kubenswrapper[4901]: I0309 03:02:13.695632 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:02:13 crc kubenswrapper[4901]: W0309 03:02:13.697831 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e7f8f6c_6ee2_4c69_a626_59821baff365.slice/crio-ff4516cb1ea306f9399f867a4f7428ac49315b2eec48f097247e19cc02ad50a1 WatchSource:0}: Error finding container ff4516cb1ea306f9399f867a4f7428ac49315b2eec48f097247e19cc02ad50a1: Status 404 returned error can't find the container with id ff4516cb1ea306f9399f867a4f7428ac49315b2eec48f097247e19cc02ad50a1 Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.179200 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" path="/var/lib/kubelet/pods/e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe/volumes" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.180058 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-4psgt"] Mar 09 03:02:14 crc kubenswrapper[4901]: E0309 03:02:14.181376 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfb1b2f-eb9b-47e0-905b-5785fc307df9" containerName="neutron-db-sync" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.181391 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfb1b2f-eb9b-47e0-905b-5785fc307df9" containerName="neutron-db-sync" Mar 09 03:02:14 crc kubenswrapper[4901]: E0309 03:02:14.181413 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" containerName="dnsmasq-dns" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.181419 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" containerName="dnsmasq-dns" Mar 09 03:02:14 crc kubenswrapper[4901]: E0309 03:02:14.181431 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" containerName="init" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.181436 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" containerName="init" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.181592 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48bc07d-a9a2-45bb-ac1a-c1dd8eca08fe" containerName="dnsmasq-dns" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.181606 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfb1b2f-eb9b-47e0-905b-5785fc307df9" containerName="neutron-db-sync" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.182641 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-4psgt"] Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.182719 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.274659 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-766b7d5cd8-9xjl5"] Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.323687 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.327807 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.327933 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.328080 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8mvrg" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.328298 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.333482 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-766b7d5cd8-9xjl5"] Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.349297 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0a22f21-beb0-44a1-943c-08547dc523a8","Type":"ContainerStarted","Data":"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb"} Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.351135 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck6bj\" (UniqueName: \"kubernetes.io/projected/189df7f6-2b92-4f36-a6e6-1462bc471159-kube-api-access-ck6bj\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.351170 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-svc\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" 
Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.351185 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-httpd-config\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.351216 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-config\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.351280 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-config\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.351304 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5plbp\" (UniqueName: \"kubernetes.io/projected/1da2c732-ff7f-4359-8ce4-575fa65b8da0-kube-api-access-5plbp\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.351322 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-combined-ca-bundle\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 
03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.351336 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-sb\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.355791 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-nb\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.355997 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-swift-storage-0\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.356051 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-ovndb-tls-certs\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.365084 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"98c082b4-bd96-4b48-80ec-6c0dc672ec58","Type":"ContainerStarted","Data":"41acdeef4d56deb89483a044161d31214685253b8c9d8783b8df0b16a66ac281"} Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 
03:02:14.365341 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"98c082b4-bd96-4b48-80ec-6c0dc672ec58","Type":"ContainerStarted","Data":"20da0afed589aa15b9d0021134c60a488b9306e15f3779f5aba6f286522d4555"} Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.387730 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hvvzs" event={"ID":"cd374a21-cd74-447e-ab94-4e60e6f0d7be","Type":"ContainerStarted","Data":"56f5155451bdd26f36bf3cf5c74566c873dff320677eab6537898737b39650c4"} Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.391088 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e7f8f6c-6ee2-4c69-a626-59821baff365","Type":"ContainerStarted","Data":"ff4516cb1ea306f9399f867a4f7428ac49315b2eec48f097247e19cc02ad50a1"} Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.399695 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qzdch" event={"ID":"5140d205-de33-4e39-95fb-451471d3e7e9","Type":"ContainerStarted","Data":"02af839290485dbd8e363b21c0ec495ab229698aebd9a2373b229156c393f2f6"} Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.399734 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qzdch" event={"ID":"5140d205-de33-4e39-95fb-451471d3e7e9","Type":"ContainerStarted","Data":"c174b0753f88f53216966bbbee693bd36c9f43b9e3397d771c1ef2b97d72ee40"} Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.405454 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-hvvzs" podStartSLOduration=3.840518892 podStartE2EDuration="25.405440061s" podCreationTimestamp="2026-03-09 03:01:49 +0000 UTC" firstStartedPulling="2026-03-09 03:01:51.193913528 +0000 UTC m=+1235.783577250" lastFinishedPulling="2026-03-09 03:02:12.758834687 +0000 UTC m=+1257.348498419" 
observedRunningTime="2026-03-09 03:02:14.402738853 +0000 UTC m=+1258.992402585" watchObservedRunningTime="2026-03-09 03:02:14.405440061 +0000 UTC m=+1258.995103793" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.406597 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550422-wg2vp" event={"ID":"f285768d-05c2-41f7-a3db-7c76d4df9fb8","Type":"ContainerStarted","Data":"d9525771df177f020c4f08dc832b902c7b59bf319df0364668dd10e9d97f8141"} Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.429746 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qzdch" podStartSLOduration=14.429725554000001 podStartE2EDuration="14.429725554s" podCreationTimestamp="2026-03-09 03:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:14.417080765 +0000 UTC m=+1259.006744497" watchObservedRunningTime="2026-03-09 03:02:14.429725554 +0000 UTC m=+1259.019389286" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.459867 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-config\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.459928 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5plbp\" (UniqueName: \"kubernetes.io/projected/1da2c732-ff7f-4359-8ce4-575fa65b8da0-kube-api-access-5plbp\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.459955 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-sb\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.459970 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-combined-ca-bundle\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.459988 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-nb\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.460064 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-swift-storage-0\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.460106 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-ovndb-tls-certs\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.460155 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck6bj\" (UniqueName: 
\"kubernetes.io/projected/189df7f6-2b92-4f36-a6e6-1462bc471159-kube-api-access-ck6bj\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.460174 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-svc\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.460190 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-httpd-config\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.460210 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-config\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.460999 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-config\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.462260 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-sb\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: 
\"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.462590 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-svc\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.462802 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-nb\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.462907 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-swift-storage-0\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.470831 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-ovndb-tls-certs\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.471574 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-config\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: 
I0309 03:02:14.483902 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-httpd-config\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.484406 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5plbp\" (UniqueName: \"kubernetes.io/projected/1da2c732-ff7f-4359-8ce4-575fa65b8da0-kube-api-access-5plbp\") pod \"dnsmasq-dns-6cbd95f657-4psgt\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.485849 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-combined-ca-bundle\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.502983 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck6bj\" (UniqueName: \"kubernetes.io/projected/189df7f6-2b92-4f36-a6e6-1462bc471159-kube-api-access-ck6bj\") pod \"neutron-766b7d5cd8-9xjl5\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.554625 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:14 crc kubenswrapper[4901]: I0309 03:02:14.664359 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:15 crc kubenswrapper[4901]: I0309 03:02:15.174788 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-4psgt"] Mar 09 03:02:15 crc kubenswrapper[4901]: W0309 03:02:15.178075 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1da2c732_ff7f_4359_8ce4_575fa65b8da0.slice/crio-d3f6eae51aa94690b4c937feaa0a4ada44470bf588890dfd09bc5704b1245748 WatchSource:0}: Error finding container d3f6eae51aa94690b4c937feaa0a4ada44470bf588890dfd09bc5704b1245748: Status 404 returned error can't find the container with id d3f6eae51aa94690b4c937feaa0a4ada44470bf588890dfd09bc5704b1245748 Mar 09 03:02:15 crc kubenswrapper[4901]: I0309 03:02:15.355128 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-766b7d5cd8-9xjl5"] Mar 09 03:02:15 crc kubenswrapper[4901]: I0309 03:02:15.417984 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550422-wg2vp" event={"ID":"f285768d-05c2-41f7-a3db-7c76d4df9fb8","Type":"ContainerStarted","Data":"ec8bd32114b4f887e1e946b40b5745464a56078a3296b072c0682cc38da9fb6e"} Mar 09 03:02:15 crc kubenswrapper[4901]: I0309 03:02:15.422104 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"98c082b4-bd96-4b48-80ec-6c0dc672ec58","Type":"ContainerStarted","Data":"1138330afb28e0cc6c7476a832e0a83a44deeaf627f09fd8766d81d2294dff1e"} Mar 09 03:02:15 crc kubenswrapper[4901]: I0309 03:02:15.424088 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e7f8f6c-6ee2-4c69-a626-59821baff365","Type":"ContainerStarted","Data":"48a396afec967efb548fbf555fa8d2eba2851c4ce077fb176764c6f76b80f5f6"} Mar 09 03:02:15 crc kubenswrapper[4901]: I0309 03:02:15.424111 4901 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e7f8f6c-6ee2-4c69-a626-59821baff365","Type":"ContainerStarted","Data":"4ecd8a9a0c326c2f31d5e17c784e9ff098a2c8264061ea6c9e191bc6bcedc8f9"} Mar 09 03:02:15 crc kubenswrapper[4901]: I0309 03:02:15.429700 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" event={"ID":"1da2c732-ff7f-4359-8ce4-575fa65b8da0","Type":"ContainerStarted","Data":"d3f6eae51aa94690b4c937feaa0a4ada44470bf588890dfd09bc5704b1245748"} Mar 09 03:02:15 crc kubenswrapper[4901]: I0309 03:02:15.437893 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550422-wg2vp" podStartSLOduration=14.323515834 podStartE2EDuration="15.437878046s" podCreationTimestamp="2026-03-09 03:02:00 +0000 UTC" firstStartedPulling="2026-03-09 03:02:13.625839957 +0000 UTC m=+1258.215503689" lastFinishedPulling="2026-03-09 03:02:14.740202169 +0000 UTC m=+1259.329865901" observedRunningTime="2026-03-09 03:02:15.435740462 +0000 UTC m=+1260.025404194" watchObservedRunningTime="2026-03-09 03:02:15.437878046 +0000 UTC m=+1260.027541778" Mar 09 03:02:15 crc kubenswrapper[4901]: I0309 03:02:15.468384 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.4683706260000005 podStartE2EDuration="4.468370626s" podCreationTimestamp="2026-03-09 03:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:15.461882632 +0000 UTC m=+1260.051546364" watchObservedRunningTime="2026-03-09 03:02:15.468370626 +0000 UTC m=+1260.058034358" Mar 09 03:02:15 crc kubenswrapper[4901]: I0309 03:02:15.490788 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.490772021 podStartE2EDuration="5.490772021s" 
podCreationTimestamp="2026-03-09 03:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:15.487613281 +0000 UTC m=+1260.077277013" watchObservedRunningTime="2026-03-09 03:02:15.490772021 +0000 UTC m=+1260.080435753" Mar 09 03:02:15 crc kubenswrapper[4901]: W0309 03:02:15.677277 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod189df7f6_2b92_4f36_a6e6_1462bc471159.slice/crio-7ab7fb0f26a5094af208c916b52ca40577d07593fc91974ee793bf1a6b39da45 WatchSource:0}: Error finding container 7ab7fb0f26a5094af208c916b52ca40577d07593fc91974ee793bf1a6b39da45: Status 404 returned error can't find the container with id 7ab7fb0f26a5094af208c916b52ca40577d07593fc91974ee793bf1a6b39da45 Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.439094 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766b7d5cd8-9xjl5" event={"ID":"189df7f6-2b92-4f36-a6e6-1462bc471159","Type":"ContainerStarted","Data":"bb6a418de9f8462e709a1779b927ca42664b3205b311697c756748bd01a3bb1e"} Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.439605 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766b7d5cd8-9xjl5" event={"ID":"189df7f6-2b92-4f36-a6e6-1462bc471159","Type":"ContainerStarted","Data":"7ab7fb0f26a5094af208c916b52ca40577d07593fc91974ee793bf1a6b39da45"} Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.440875 4901 generic.go:334] "Generic (PLEG): container finished" podID="1da2c732-ff7f-4359-8ce4-575fa65b8da0" containerID="1d53ed824e5bcad8a60829f5bcbcafb26fcda34318175adc46a6fe47e190e181" exitCode=0 Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.440919 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" 
event={"ID":"1da2c732-ff7f-4359-8ce4-575fa65b8da0","Type":"ContainerDied","Data":"1d53ed824e5bcad8a60829f5bcbcafb26fcda34318175adc46a6fe47e190e181"} Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.443630 4901 generic.go:334] "Generic (PLEG): container finished" podID="f285768d-05c2-41f7-a3db-7c76d4df9fb8" containerID="ec8bd32114b4f887e1e946b40b5745464a56078a3296b072c0682cc38da9fb6e" exitCode=0 Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.443714 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550422-wg2vp" event={"ID":"f285768d-05c2-41f7-a3db-7c76d4df9fb8","Type":"ContainerDied","Data":"ec8bd32114b4f887e1e946b40b5745464a56078a3296b072c0682cc38da9fb6e"} Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.455002 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0a22f21-beb0-44a1-943c-08547dc523a8","Type":"ContainerStarted","Data":"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8"} Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.462539 4901 generic.go:334] "Generic (PLEG): container finished" podID="cd374a21-cd74-447e-ab94-4e60e6f0d7be" containerID="56f5155451bdd26f36bf3cf5c74566c873dff320677eab6537898737b39650c4" exitCode=0 Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.463268 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hvvzs" event={"ID":"cd374a21-cd74-447e-ab94-4e60e6f0d7be","Type":"ContainerDied","Data":"56f5155451bdd26f36bf3cf5c74566c873dff320677eab6537898737b39650c4"} Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.846976 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57d6545db5-vnkq4"] Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.848587 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.850985 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.851149 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 09 03:02:16 crc kubenswrapper[4901]: I0309 03:02:16.863345 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57d6545db5-vnkq4"] Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.021157 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-ovndb-tls-certs\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.021203 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-combined-ca-bundle\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.021283 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-public-tls-certs\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.021309 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-config\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.021333 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-internal-tls-certs\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.021407 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6hh\" (UniqueName: \"kubernetes.io/projected/7f7aec5c-9887-4331-8806-3164120e927e-kube-api-access-kd6hh\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.021445 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-httpd-config\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.123190 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6hh\" (UniqueName: \"kubernetes.io/projected/7f7aec5c-9887-4331-8806-3164120e927e-kube-api-access-kd6hh\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.123298 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-httpd-config\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.123379 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-ovndb-tls-certs\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.123408 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-combined-ca-bundle\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.123611 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-public-tls-certs\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.123634 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-config\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.123666 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-internal-tls-certs\") pod 
\"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.138467 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-httpd-config\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.139015 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-internal-tls-certs\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.140810 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-public-tls-certs\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.140865 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-combined-ca-bundle\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.141074 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-ovndb-tls-certs\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 
03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.141483 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-config\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.141739 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6hh\" (UniqueName: \"kubernetes.io/projected/7f7aec5c-9887-4331-8806-3164120e927e-kube-api-access-kd6hh\") pod \"neutron-57d6545db5-vnkq4\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.172390 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.476186 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" event={"ID":"1da2c732-ff7f-4359-8ce4-575fa65b8da0","Type":"ContainerStarted","Data":"b061f4aaf549eb86d1d7efe612297014d2cec92cbdf9b8e77e24149dce0551ac"} Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.476591 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.477801 4901 generic.go:334] "Generic (PLEG): container finished" podID="5140d205-de33-4e39-95fb-451471d3e7e9" containerID="02af839290485dbd8e363b21c0ec495ab229698aebd9a2373b229156c393f2f6" exitCode=0 Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.477870 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qzdch" event={"ID":"5140d205-de33-4e39-95fb-451471d3e7e9","Type":"ContainerDied","Data":"02af839290485dbd8e363b21c0ec495ab229698aebd9a2373b229156c393f2f6"} Mar 09 03:02:17 crc 
kubenswrapper[4901]: I0309 03:02:17.479887 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766b7d5cd8-9xjl5" event={"ID":"189df7f6-2b92-4f36-a6e6-1462bc471159","Type":"ContainerStarted","Data":"b63c90a7bd193594cf46a664293cb499ada4dd2e2e4629d238df3be2d6a82bd4"} Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.501236 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" podStartSLOduration=3.501207387 podStartE2EDuration="3.501207387s" podCreationTimestamp="2026-03-09 03:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:17.496764085 +0000 UTC m=+1262.086427837" watchObservedRunningTime="2026-03-09 03:02:17.501207387 +0000 UTC m=+1262.090871119" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.542689 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-766b7d5cd8-9xjl5" podStartSLOduration=3.542673873 podStartE2EDuration="3.542673873s" podCreationTimestamp="2026-03-09 03:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:17.539018381 +0000 UTC m=+1262.128682113" watchObservedRunningTime="2026-03-09 03:02:17.542673873 +0000 UTC m=+1262.132337605" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.731804 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57d6545db5-vnkq4"] Mar 09 03:02:17 crc kubenswrapper[4901]: W0309 03:02:17.754340 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f7aec5c_9887_4331_8806_3164120e927e.slice/crio-3702362ca50b6d59759c487de59179513d8ee76d883cfb504dcec113ac0330e4 WatchSource:0}: Error finding container 
3702362ca50b6d59759c487de59179513d8ee76d883cfb504dcec113ac0330e4: Status 404 returned error can't find the container with id 3702362ca50b6d59759c487de59179513d8ee76d883cfb504dcec113ac0330e4 Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.930624 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hvvzs" Mar 09 03:02:17 crc kubenswrapper[4901]: I0309 03:02:17.953092 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550422-wg2vp" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.050544 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-config-data\") pod \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.050799 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-scripts\") pod \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.050945 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd374a21-cd74-447e-ab94-4e60e6f0d7be-logs\") pod \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.051011 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-combined-ca-bundle\") pod \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 
03:02:18.051082 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbnqd\" (UniqueName: \"kubernetes.io/projected/cd374a21-cd74-447e-ab94-4e60e6f0d7be-kube-api-access-sbnqd\") pod \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\" (UID: \"cd374a21-cd74-447e-ab94-4e60e6f0d7be\") " Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.054172 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd374a21-cd74-447e-ab94-4e60e6f0d7be-logs" (OuterVolumeSpecName: "logs") pod "cd374a21-cd74-447e-ab94-4e60e6f0d7be" (UID: "cd374a21-cd74-447e-ab94-4e60e6f0d7be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.057407 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd374a21-cd74-447e-ab94-4e60e6f0d7be-kube-api-access-sbnqd" (OuterVolumeSpecName: "kube-api-access-sbnqd") pod "cd374a21-cd74-447e-ab94-4e60e6f0d7be" (UID: "cd374a21-cd74-447e-ab94-4e60e6f0d7be"). InnerVolumeSpecName "kube-api-access-sbnqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.057367 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-scripts" (OuterVolumeSpecName: "scripts") pod "cd374a21-cd74-447e-ab94-4e60e6f0d7be" (UID: "cd374a21-cd74-447e-ab94-4e60e6f0d7be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.086858 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-config-data" (OuterVolumeSpecName: "config-data") pod "cd374a21-cd74-447e-ab94-4e60e6f0d7be" (UID: "cd374a21-cd74-447e-ab94-4e60e6f0d7be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.092314 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd374a21-cd74-447e-ab94-4e60e6f0d7be" (UID: "cd374a21-cd74-447e-ab94-4e60e6f0d7be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.155018 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rkxc\" (UniqueName: \"kubernetes.io/projected/f285768d-05c2-41f7-a3db-7c76d4df9fb8-kube-api-access-4rkxc\") pod \"f285768d-05c2-41f7-a3db-7c76d4df9fb8\" (UID: \"f285768d-05c2-41f7-a3db-7c76d4df9fb8\") " Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.155899 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.155917 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd374a21-cd74-447e-ab94-4e60e6f0d7be-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.155926 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.155935 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbnqd\" (UniqueName: \"kubernetes.io/projected/cd374a21-cd74-447e-ab94-4e60e6f0d7be-kube-api-access-sbnqd\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.155943 4901 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd374a21-cd74-447e-ab94-4e60e6f0d7be-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.166579 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f285768d-05c2-41f7-a3db-7c76d4df9fb8-kube-api-access-4rkxc" (OuterVolumeSpecName: "kube-api-access-4rkxc") pod "f285768d-05c2-41f7-a3db-7c76d4df9fb8" (UID: "f285768d-05c2-41f7-a3db-7c76d4df9fb8"). InnerVolumeSpecName "kube-api-access-4rkxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.258287 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rkxc\" (UniqueName: \"kubernetes.io/projected/f285768d-05c2-41f7-a3db-7c76d4df9fb8-kube-api-access-4rkxc\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.497584 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hvvzs" event={"ID":"cd374a21-cd74-447e-ab94-4e60e6f0d7be","Type":"ContainerDied","Data":"17866f29c8a8e93d4f89215a4f491b65610bfd63f5c2cd348280be8566eb6f51"} Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.498683 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17866f29c8a8e93d4f89215a4f491b65610bfd63f5c2cd348280be8566eb6f51" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.497646 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hvvzs" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.519692 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d6545db5-vnkq4" event={"ID":"7f7aec5c-9887-4331-8806-3164120e927e","Type":"ContainerStarted","Data":"8c06a059ceda4f7e216a798de721d42fa01fd9f026bcf1c558dc22f3544edb04"} Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.520056 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d6545db5-vnkq4" event={"ID":"7f7aec5c-9887-4331-8806-3164120e927e","Type":"ContainerStarted","Data":"d470f4319d83b0f2465857ddb8bb7578426b84cca0130f611b83b80a7b4431a5"} Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.520129 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d6545db5-vnkq4" event={"ID":"7f7aec5c-9887-4331-8806-3164120e927e","Type":"ContainerStarted","Data":"3702362ca50b6d59759c487de59179513d8ee76d883cfb504dcec113ac0330e4"} Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.521380 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550416-zc9wt"] Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.521546 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.536490 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550416-zc9wt"] Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.541699 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550422-wg2vp" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.550387 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550422-wg2vp" event={"ID":"f285768d-05c2-41f7-a3db-7c76d4df9fb8","Type":"ContainerDied","Data":"d9525771df177f020c4f08dc832b902c7b59bf319df0364668dd10e9d97f8141"} Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.550423 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9525771df177f020c4f08dc832b902c7b59bf319df0364668dd10e9d97f8141" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.553077 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.570526 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57d6545db5-vnkq4" podStartSLOduration=2.570506452 podStartE2EDuration="2.570506452s" podCreationTimestamp="2026-03-09 03:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:18.554476898 +0000 UTC m=+1263.144140630" watchObservedRunningTime="2026-03-09 03:02:18.570506452 +0000 UTC m=+1263.160170184" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.631464 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-588d7b64fd-wbsl2"] Mar 09 03:02:18 crc kubenswrapper[4901]: E0309 03:02:18.631798 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd374a21-cd74-447e-ab94-4e60e6f0d7be" containerName="placement-db-sync" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.631816 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd374a21-cd74-447e-ab94-4e60e6f0d7be" containerName="placement-db-sync" Mar 09 03:02:18 crc kubenswrapper[4901]: E0309 03:02:18.631832 4901 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f285768d-05c2-41f7-a3db-7c76d4df9fb8" containerName="oc" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.631839 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f285768d-05c2-41f7-a3db-7c76d4df9fb8" containerName="oc" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.632034 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd374a21-cd74-447e-ab94-4e60e6f0d7be" containerName="placement-db-sync" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.632065 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f285768d-05c2-41f7-a3db-7c76d4df9fb8" containerName="oc" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.632997 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.635370 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.635417 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.635370 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.635532 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.635944 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-5mjl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.649058 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-588d7b64fd-wbsl2"] Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.684496 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-scripts\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.684553 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-internal-tls-certs\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.684576 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-public-tls-certs\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.684620 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50fd5778-3018-4a41-8db7-285ca63540a5-logs\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.684638 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-config-data\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.684654 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n7x9\" (UniqueName: \"kubernetes.io/projected/50fd5778-3018-4a41-8db7-285ca63540a5-kube-api-access-6n7x9\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.684687 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-combined-ca-bundle\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.786377 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-scripts\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.786660 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-internal-tls-certs\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.786690 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-public-tls-certs\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.786736 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50fd5778-3018-4a41-8db7-285ca63540a5-logs\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.786757 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-config-data\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.786777 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n7x9\" (UniqueName: \"kubernetes.io/projected/50fd5778-3018-4a41-8db7-285ca63540a5-kube-api-access-6n7x9\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.786820 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-combined-ca-bundle\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.787455 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50fd5778-3018-4a41-8db7-285ca63540a5-logs\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.790937 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-internal-tls-certs\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.791091 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-scripts\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.791782 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-public-tls-certs\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.793692 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-combined-ca-bundle\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.795057 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-config-data\") pod \"placement-588d7b64fd-wbsl2\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.808125 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n7x9\" (UniqueName: \"kubernetes.io/projected/50fd5778-3018-4a41-8db7-285ca63540a5-kube-api-access-6n7x9\") pod \"placement-588d7b64fd-wbsl2\" 
(UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:18 crc kubenswrapper[4901]: I0309 03:02:18.962208 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.084282 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.202801 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-combined-ca-bundle\") pod \"5140d205-de33-4e39-95fb-451471d3e7e9\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.202850 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-config-data\") pod \"5140d205-de33-4e39-95fb-451471d3e7e9\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.202903 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-fernet-keys\") pod \"5140d205-de33-4e39-95fb-451471d3e7e9\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.202925 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-scripts\") pod \"5140d205-de33-4e39-95fb-451471d3e7e9\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.202959 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-credential-keys\") pod \"5140d205-de33-4e39-95fb-451471d3e7e9\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.203002 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp5pd\" (UniqueName: \"kubernetes.io/projected/5140d205-de33-4e39-95fb-451471d3e7e9-kube-api-access-qp5pd\") pod \"5140d205-de33-4e39-95fb-451471d3e7e9\" (UID: \"5140d205-de33-4e39-95fb-451471d3e7e9\") " Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.235661 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5140d205-de33-4e39-95fb-451471d3e7e9-kube-api-access-qp5pd" (OuterVolumeSpecName: "kube-api-access-qp5pd") pod "5140d205-de33-4e39-95fb-451471d3e7e9" (UID: "5140d205-de33-4e39-95fb-451471d3e7e9"). InnerVolumeSpecName "kube-api-access-qp5pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.250655 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5140d205-de33-4e39-95fb-451471d3e7e9" (UID: "5140d205-de33-4e39-95fb-451471d3e7e9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.260803 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5140d205-de33-4e39-95fb-451471d3e7e9" (UID: "5140d205-de33-4e39-95fb-451471d3e7e9"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.261675 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-scripts" (OuterVolumeSpecName: "scripts") pod "5140d205-de33-4e39-95fb-451471d3e7e9" (UID: "5140d205-de33-4e39-95fb-451471d3e7e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.312362 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.312373 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-config-data" (OuterVolumeSpecName: "config-data") pod "5140d205-de33-4e39-95fb-451471d3e7e9" (UID: "5140d205-de33-4e39-95fb-451471d3e7e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.312398 4901 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.312457 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp5pd\" (UniqueName: \"kubernetes.io/projected/5140d205-de33-4e39-95fb-451471d3e7e9-kube-api-access-qp5pd\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.312469 4901 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.356408 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5140d205-de33-4e39-95fb-451471d3e7e9" (UID: "5140d205-de33-4e39-95fb-451471d3e7e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.416396 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.416422 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5140d205-de33-4e39-95fb-451471d3e7e9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.558564 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qzdch" event={"ID":"5140d205-de33-4e39-95fb-451471d3e7e9","Type":"ContainerDied","Data":"c174b0753f88f53216966bbbee693bd36c9f43b9e3397d771c1ef2b97d72ee40"} Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.558624 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c174b0753f88f53216966bbbee693bd36c9f43b9e3397d771c1ef2b97d72ee40" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.558796 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qzdch" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.602586 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-588d7b64fd-wbsl2"] Mar 09 03:02:19 crc kubenswrapper[4901]: W0309 03:02:19.609889 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50fd5778_3018_4a41_8db7_285ca63540a5.slice/crio-eb6bf3e01b9775960887acf875cff95b811097b67344ba8c4818e22727cd4a25 WatchSource:0}: Error finding container eb6bf3e01b9775960887acf875cff95b811097b67344ba8c4818e22727cd4a25: Status 404 returned error can't find the container with id eb6bf3e01b9775960887acf875cff95b811097b67344ba8c4818e22727cd4a25 Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.668133 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-79578f965f-zpp5p"] Mar 09 03:02:19 crc kubenswrapper[4901]: E0309 03:02:19.668497 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5140d205-de33-4e39-95fb-451471d3e7e9" containerName="keystone-bootstrap" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.668512 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5140d205-de33-4e39-95fb-451471d3e7e9" containerName="keystone-bootstrap" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.668693 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5140d205-de33-4e39-95fb-451471d3e7e9" containerName="keystone-bootstrap" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.672798 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.675731 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.676018 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.676175 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.676365 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.676755 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ghd6s" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.676770 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.683944 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79578f965f-zpp5p"] Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.720244 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-config-data\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.720321 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-public-tls-certs\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " 
pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.720342 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-fernet-keys\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.720367 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-credential-keys\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.720400 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-combined-ca-bundle\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.720420 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-scripts\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.720437 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbnsj\" (UniqueName: \"kubernetes.io/projected/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-kube-api-access-dbnsj\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") 
" pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.720481 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-internal-tls-certs\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.821935 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-public-tls-certs\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.822256 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-fernet-keys\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.822278 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-credential-keys\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.822307 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-combined-ca-bundle\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc 
kubenswrapper[4901]: I0309 03:02:19.822328 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-scripts\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.822349 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbnsj\" (UniqueName: \"kubernetes.io/projected/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-kube-api-access-dbnsj\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.822396 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-internal-tls-certs\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.822443 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-config-data\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.829810 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-internal-tls-certs\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.831001 4901 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-fernet-keys\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.832259 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-credential-keys\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.833718 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-config-data\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.835890 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-combined-ca-bundle\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.835958 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-public-tls-certs\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.846316 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbnsj\" (UniqueName: 
\"kubernetes.io/projected/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-kube-api-access-dbnsj\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:19 crc kubenswrapper[4901]: I0309 03:02:19.851583 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-scripts\") pod \"keystone-79578f965f-zpp5p\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.004939 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.121713 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd0bf33-39b8-4401-a8af-63b45e82b5fd" path="/var/lib/kubelet/pods/8fd0bf33-39b8-4401-a8af-63b45e82b5fd/volumes" Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.496280 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79578f965f-zpp5p"] Mar 09 03:02:20 crc kubenswrapper[4901]: W0309 03:02:20.499507 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode45b6a38_6035_4fd4_a525_5d51ac6d0a2d.slice/crio-e9eb5e1585081f94ba18b305e25fe998c1db60ca1d2a6be40107c6d77dfa8484 WatchSource:0}: Error finding container e9eb5e1585081f94ba18b305e25fe998c1db60ca1d2a6be40107c6d77dfa8484: Status 404 returned error can't find the container with id e9eb5e1585081f94ba18b305e25fe998c1db60ca1d2a6be40107c6d77dfa8484 Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.506647 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.507363 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.536547 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.545262 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.577601 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-588d7b64fd-wbsl2" event={"ID":"50fd5778-3018-4a41-8db7-285ca63540a5","Type":"ContainerStarted","Data":"0b687bccedee7ceb14535726d105414601f7fa2b74a29a4654dc4138b0a31d91"} Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.577645 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-588d7b64fd-wbsl2" event={"ID":"50fd5778-3018-4a41-8db7-285ca63540a5","Type":"ContainerStarted","Data":"92f0b018fb501ac1d31f864c3c26312799db4eaeb5131c441e1d09cb7f469811"} Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.577656 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-588d7b64fd-wbsl2" event={"ID":"50fd5778-3018-4a41-8db7-285ca63540a5","Type":"ContainerStarted","Data":"eb6bf3e01b9775960887acf875cff95b811097b67344ba8c4818e22727cd4a25"} Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.580932 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79578f965f-zpp5p" event={"ID":"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d","Type":"ContainerStarted","Data":"e9eb5e1585081f94ba18b305e25fe998c1db60ca1d2a6be40107c6d77dfa8484"} Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.582255 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:20 crc kubenswrapper[4901]: I0309 03:02:20.582301 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:21 crc kubenswrapper[4901]: I0309 03:02:21.536267 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 03:02:21 crc kubenswrapper[4901]: I0309 03:02:21.536604 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 03:02:21 crc kubenswrapper[4901]: I0309 03:02:21.590103 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:21 crc kubenswrapper[4901]: I0309 03:02:21.590154 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:21 crc kubenswrapper[4901]: I0309 03:02:21.601349 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 03:02:21 crc kubenswrapper[4901]: I0309 03:02:21.601865 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 03:02:21 crc kubenswrapper[4901]: I0309 03:02:21.607332 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 03:02:21 crc kubenswrapper[4901]: I0309 03:02:21.621804 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-588d7b64fd-wbsl2" podStartSLOduration=3.621789745 podStartE2EDuration="3.621789745s" podCreationTimestamp="2026-03-09 03:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:21.615159188 +0000 UTC m=+1266.204822920" watchObservedRunningTime="2026-03-09 03:02:21.621789745 +0000 UTC m=+1266.211453477" Mar 09 03:02:22 crc kubenswrapper[4901]: I0309 03:02:22.541673 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:22 crc kubenswrapper[4901]: I0309 03:02:22.592976 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 03:02:22 crc kubenswrapper[4901]: I0309 03:02:22.596344 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 03:02:23 crc kubenswrapper[4901]: I0309 03:02:23.603236 4901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 03:02:24 crc kubenswrapper[4901]: I0309 03:02:24.208137 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 03:02:24 crc kubenswrapper[4901]: I0309 03:02:24.556526 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:24 crc kubenswrapper[4901]: I0309 03:02:24.635098 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79578f965f-zpp5p" event={"ID":"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d","Type":"ContainerStarted","Data":"2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355"} Mar 09 03:02:24 crc kubenswrapper[4901]: I0309 03:02:24.635254 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:24 crc kubenswrapper[4901]: I0309 03:02:24.636634 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-pv679"] Mar 09 03:02:24 crc kubenswrapper[4901]: I0309 03:02:24.636918 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" podUID="4e886d37-056b-4000-876d-881906f1e3b3" containerName="dnsmasq-dns" containerID="cri-o://cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef" gracePeriod=10 Mar 09 03:02:24 crc kubenswrapper[4901]: 
I0309 03:02:24.648270 4901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 03:02:24 crc kubenswrapper[4901]: I0309 03:02:24.682715 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-79578f965f-zpp5p" podStartSLOduration=5.682699821 podStartE2EDuration="5.682699821s" podCreationTimestamp="2026-03-09 03:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:24.660546102 +0000 UTC m=+1269.250209834" watchObservedRunningTime="2026-03-09 03:02:24.682699821 +0000 UTC m=+1269.272363553" Mar 09 03:02:24 crc kubenswrapper[4901]: E0309 03:02:24.757649 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e886d37_056b_4000_876d_881906f1e3b3.slice/crio-cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef.scope\": RecentStats: unable to find data in memory cache]" Mar 09 03:02:24 crc kubenswrapper[4901]: I0309 03:02:24.769769 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" podUID="4e886d37-056b-4000-876d-881906f1e3b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: connect: connection refused" Mar 09 03:02:24 crc kubenswrapper[4901]: I0309 03:02:24.845724 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.258077 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.336713 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-svc\") pod \"4e886d37-056b-4000-876d-881906f1e3b3\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.336850 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnvp2\" (UniqueName: \"kubernetes.io/projected/4e886d37-056b-4000-876d-881906f1e3b3-kube-api-access-jnvp2\") pod \"4e886d37-056b-4000-876d-881906f1e3b3\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.336871 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-config\") pod \"4e886d37-056b-4000-876d-881906f1e3b3\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.337023 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-nb\") pod \"4e886d37-056b-4000-876d-881906f1e3b3\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.337064 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-sb\") pod \"4e886d37-056b-4000-876d-881906f1e3b3\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.337104 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-swift-storage-0\") pod \"4e886d37-056b-4000-876d-881906f1e3b3\" (UID: \"4e886d37-056b-4000-876d-881906f1e3b3\") " Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.342487 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e886d37-056b-4000-876d-881906f1e3b3-kube-api-access-jnvp2" (OuterVolumeSpecName: "kube-api-access-jnvp2") pod "4e886d37-056b-4000-876d-881906f1e3b3" (UID: "4e886d37-056b-4000-876d-881906f1e3b3"). InnerVolumeSpecName "kube-api-access-jnvp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.388085 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-config" (OuterVolumeSpecName: "config") pod "4e886d37-056b-4000-876d-881906f1e3b3" (UID: "4e886d37-056b-4000-876d-881906f1e3b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.402781 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e886d37-056b-4000-876d-881906f1e3b3" (UID: "4e886d37-056b-4000-876d-881906f1e3b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.406807 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e886d37-056b-4000-876d-881906f1e3b3" (UID: "4e886d37-056b-4000-876d-881906f1e3b3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.417701 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e886d37-056b-4000-876d-881906f1e3b3" (UID: "4e886d37-056b-4000-876d-881906f1e3b3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.439277 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4e886d37-056b-4000-876d-881906f1e3b3" (UID: "4e886d37-056b-4000-876d-881906f1e3b3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.439738 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.439757 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnvp2\" (UniqueName: \"kubernetes.io/projected/4e886d37-056b-4000-876d-881906f1e3b3-kube-api-access-jnvp2\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.439768 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.439777 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.439785 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.439795 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e886d37-056b-4000-876d-881906f1e3b3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.659936 4901 generic.go:334] "Generic (PLEG): container finished" podID="4e886d37-056b-4000-876d-881906f1e3b3" containerID="cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef" exitCode=0 Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.660003 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" event={"ID":"4e886d37-056b-4000-876d-881906f1e3b3","Type":"ContainerDied","Data":"cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef"} Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.660034 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" event={"ID":"4e886d37-056b-4000-876d-881906f1e3b3","Type":"ContainerDied","Data":"c9995225a00974cc442dc9a8d79ed0476c1ebc61295d633c1280d0d84035f10c"} Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.660053 4901 scope.go:117] "RemoveContainer" containerID="cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.660174 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf6456ddf-pv679" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.664143 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0a22f21-beb0-44a1-943c-08547dc523a8","Type":"ContainerStarted","Data":"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812"} Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.667615 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2mh7b" event={"ID":"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c","Type":"ContainerStarted","Data":"1318be7d0c979674383ef3ff96fcc0e4c7a31c99f6b2aff5bad67a5d745786d1"} Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.686483 4901 scope.go:117] "RemoveContainer" containerID="7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.705071 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-2mh7b" podStartSLOduration=3.566092387 podStartE2EDuration="36.705053132s" podCreationTimestamp="2026-03-09 03:01:49 +0000 UTC" firstStartedPulling="2026-03-09 03:01:51.067875077 +0000 UTC m=+1235.657538809" lastFinishedPulling="2026-03-09 03:02:24.206835822 +0000 UTC m=+1268.796499554" observedRunningTime="2026-03-09 03:02:25.69308855 +0000 UTC m=+1270.282752292" watchObservedRunningTime="2026-03-09 03:02:25.705053132 +0000 UTC m=+1270.294716864" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.720905 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-pv679"] Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.721592 4901 scope.go:117] "RemoveContainer" containerID="cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef" Mar 09 03:02:25 crc kubenswrapper[4901]: E0309 03:02:25.722055 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef\": container with ID starting with cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef not found: ID does not exist" containerID="cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.722139 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef"} err="failed to get container status \"cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef\": rpc error: code = NotFound desc = could not find container \"cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef\": container with ID starting with cb6cb33ae3974e418b66973e91b2f8a2d4dea520a43558c5b8091cf96377b2ef not found: ID does not exist" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.722172 4901 scope.go:117] "RemoveContainer" containerID="7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef" Mar 09 03:02:25 crc kubenswrapper[4901]: E0309 03:02:25.722559 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef\": container with ID starting with 7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef not found: ID does not exist" containerID="7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.722583 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef"} err="failed to get container status \"7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef\": rpc error: code = NotFound desc = could not find container \"7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef\": container 
with ID starting with 7b8f099fa8eab8f44a3ad033edfe6be9203fd9faee348e42b6abfacee9b039ef not found: ID does not exist" Mar 09 03:02:25 crc kubenswrapper[4901]: I0309 03:02:25.730148 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf6456ddf-pv679"] Mar 09 03:02:26 crc kubenswrapper[4901]: I0309 03:02:26.118369 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e886d37-056b-4000-876d-881906f1e3b3" path="/var/lib/kubelet/pods/4e886d37-056b-4000-876d-881906f1e3b3/volumes" Mar 09 03:02:27 crc kubenswrapper[4901]: I0309 03:02:27.691722 4901 generic.go:334] "Generic (PLEG): container finished" podID="7acea3ec-d1eb-4971-b3b5-7c0b898cf07c" containerID="1318be7d0c979674383ef3ff96fcc0e4c7a31c99f6b2aff5bad67a5d745786d1" exitCode=0 Mar 09 03:02:27 crc kubenswrapper[4901]: I0309 03:02:27.691789 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2mh7b" event={"ID":"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c","Type":"ContainerDied","Data":"1318be7d0c979674383ef3ff96fcc0e4c7a31c99f6b2aff5bad67a5d745786d1"} Mar 09 03:02:28 crc kubenswrapper[4901]: I0309 03:02:28.724526 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-24d57" event={"ID":"e040c407-4b37-4bee-b200-0d97b5767ef1","Type":"ContainerStarted","Data":"e864148fe534dd0649b1839ddd987f99c31ebbf9373117d1b9294d3a433bc88e"} Mar 09 03:02:28 crc kubenswrapper[4901]: I0309 03:02:28.751046 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-24d57" podStartSLOduration=3.5701215570000002 podStartE2EDuration="39.751029851s" podCreationTimestamp="2026-03-09 03:01:49 +0000 UTC" firstStartedPulling="2026-03-09 03:01:51.377352797 +0000 UTC m=+1235.967016529" lastFinishedPulling="2026-03-09 03:02:27.558261081 +0000 UTC m=+1272.147924823" observedRunningTime="2026-03-09 03:02:28.74547196 +0000 UTC m=+1273.335135702" watchObservedRunningTime="2026-03-09 03:02:28.751029851 
+0000 UTC m=+1273.340693583" Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.653213 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.671406 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq94g\" (UniqueName: \"kubernetes.io/projected/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-kube-api-access-cq94g\") pod \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.671515 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-db-sync-config-data\") pod \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.671578 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-combined-ca-bundle\") pod \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\" (UID: \"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c\") " Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.678496 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7acea3ec-d1eb-4971-b3b5-7c0b898cf07c" (UID: "7acea3ec-d1eb-4971-b3b5-7c0b898cf07c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.681062 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-kube-api-access-cq94g" (OuterVolumeSpecName: "kube-api-access-cq94g") pod "7acea3ec-d1eb-4971-b3b5-7c0b898cf07c" (UID: "7acea3ec-d1eb-4971-b3b5-7c0b898cf07c"). InnerVolumeSpecName "kube-api-access-cq94g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.713526 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7acea3ec-d1eb-4971-b3b5-7c0b898cf07c" (UID: "7acea3ec-d1eb-4971-b3b5-7c0b898cf07c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.757498 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2mh7b" event={"ID":"7acea3ec-d1eb-4971-b3b5-7c0b898cf07c","Type":"ContainerDied","Data":"0875d8be1bdf57e78a21e99e38b8e031a6989e2cf5f767e3ceedb36940ee24dc"} Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.757537 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0875d8be1bdf57e78a21e99e38b8e031a6989e2cf5f767e3ceedb36940ee24dc" Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.757585 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-2mh7b" Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.773283 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq94g\" (UniqueName: \"kubernetes.io/projected/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-kube-api-access-cq94g\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.773316 4901 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:31 crc kubenswrapper[4901]: I0309 03:02:31.773349 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.784399 4901 generic.go:334] "Generic (PLEG): container finished" podID="e040c407-4b37-4bee-b200-0d97b5767ef1" containerID="e864148fe534dd0649b1839ddd987f99c31ebbf9373117d1b9294d3a433bc88e" exitCode=0 Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.784513 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-24d57" event={"ID":"e040c407-4b37-4bee-b200-0d97b5767ef1","Type":"ContainerDied","Data":"e864148fe534dd0649b1839ddd987f99c31ebbf9373117d1b9294d3a433bc88e"} Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.987064 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-875b9dd78-8t9g6"] Mar 09 03:02:32 crc kubenswrapper[4901]: E0309 03:02:32.987639 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e886d37-056b-4000-876d-881906f1e3b3" containerName="init" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.987655 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e886d37-056b-4000-876d-881906f1e3b3" 
containerName="init" Mar 09 03:02:32 crc kubenswrapper[4901]: E0309 03:02:32.987689 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7acea3ec-d1eb-4971-b3b5-7c0b898cf07c" containerName="barbican-db-sync" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.987694 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7acea3ec-d1eb-4971-b3b5-7c0b898cf07c" containerName="barbican-db-sync" Mar 09 03:02:32 crc kubenswrapper[4901]: E0309 03:02:32.987710 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e886d37-056b-4000-876d-881906f1e3b3" containerName="dnsmasq-dns" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.987718 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e886d37-056b-4000-876d-881906f1e3b3" containerName="dnsmasq-dns" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.987875 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7acea3ec-d1eb-4971-b3b5-7c0b898cf07c" containerName="barbican-db-sync" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.987902 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e886d37-056b-4000-876d-881906f1e3b3" containerName="dnsmasq-dns" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.988739 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.991434 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.991659 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-7z846" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.993707 4901 scope.go:117] "RemoveContainer" containerID="b155cf8af242930d449d148a50eeb7478ddef3a68e54ecc4a4107560daad0045" Mar 09 03:02:32 crc kubenswrapper[4901]: I0309 03:02:32.994356 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.009745 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-875b9dd78-8t9g6"] Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.026442 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7d497f76dc-pptvt"] Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.054973 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.058946 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.087209 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d497f76dc-pptvt"] Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.097175 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.097226 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-combined-ca-bundle\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.097274 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79g9h\" (UniqueName: \"kubernetes.io/projected/baa336b3-abdd-43e2-9c54-6d8d34c71204-kube-api-access-79g9h\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.097290 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6790ccc5-8f7f-4de8-bd69-652661631307-logs\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: 
\"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.097308 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data-custom\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.097325 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.097364 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-combined-ca-bundle\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.097382 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdcnr\" (UniqueName: \"kubernetes.io/projected/6790ccc5-8f7f-4de8-bd69-652661631307-kube-api-access-wdcnr\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.097410 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data-custom\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.097443 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa336b3-abdd-43e2-9c54-6d8d34c71204-logs\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.111519 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-bs2zk"] Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.113474 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.158326 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-bs2zk"] Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.193754 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f87bb5fc6-qwprj"] Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.195108 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.199717 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200085 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-combined-ca-bundle\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200120 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdcnr\" (UniqueName: \"kubernetes.io/projected/6790ccc5-8f7f-4de8-bd69-652661631307-kube-api-access-wdcnr\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200145 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glpdj\" (UniqueName: \"kubernetes.io/projected/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-kube-api-access-glpdj\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200170 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data-custom\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200186 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-sb\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200204 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-nb\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200236 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-config\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200276 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa336b3-abdd-43e2-9c54-6d8d34c71204-logs\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200300 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-logs\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200326 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-combined-ca-bundle\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200352 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200371 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200390 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-combined-ca-bundle\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200404 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-swift-storage-0\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: 
I0309 03:02:33.200441 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data-custom\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200474 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79g9h\" (UniqueName: \"kubernetes.io/projected/baa336b3-abdd-43e2-9c54-6d8d34c71204-kube-api-access-79g9h\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200494 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6790ccc5-8f7f-4de8-bd69-652661631307-logs\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200512 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data-custom\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200530 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzpqc\" (UniqueName: \"kubernetes.io/projected/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-kube-api-access-bzpqc\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " 
pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200551 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.200582 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-svc\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.203259 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa336b3-abdd-43e2-9c54-6d8d34c71204-logs\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.204187 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f87bb5fc6-qwprj"] Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.207561 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6790ccc5-8f7f-4de8-bd69-652661631307-logs\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.207821 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-combined-ca-bundle\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.207940 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data-custom\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.212502 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.213795 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data-custom\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.215311 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.229385 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-combined-ca-bundle\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.233065 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdcnr\" (UniqueName: \"kubernetes.io/projected/6790ccc5-8f7f-4de8-bd69-652661631307-kube-api-access-wdcnr\") pod \"barbican-worker-7d497f76dc-pptvt\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.234605 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79g9h\" (UniqueName: \"kubernetes.io/projected/baa336b3-abdd-43e2-9c54-6d8d34c71204-kube-api-access-79g9h\") pod \"barbican-keystone-listener-875b9dd78-8t9g6\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302669 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302715 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-swift-storage-0\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302743 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data-custom\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302767 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzpqc\" (UniqueName: \"kubernetes.io/projected/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-kube-api-access-bzpqc\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302795 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-svc\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302835 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glpdj\" (UniqueName: \"kubernetes.io/projected/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-kube-api-access-glpdj\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302862 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-sb\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302879 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-nb\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302911 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-config\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302935 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-logs\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.302963 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-combined-ca-bundle\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.304068 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-svc\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.304641 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-sb\") pod 
\"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.305214 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-nb\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.305490 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-swift-storage-0\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.305754 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-config\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.305923 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-combined-ca-bundle\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.305975 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-logs\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " 
pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.308079 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.309856 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data-custom\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.318962 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glpdj\" (UniqueName: \"kubernetes.io/projected/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-kube-api-access-glpdj\") pod \"dnsmasq-dns-55c649d8d5-bs2zk\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.319205 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzpqc\" (UniqueName: \"kubernetes.io/projected/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-kube-api-access-bzpqc\") pod \"barbican-api-6f87bb5fc6-qwprj\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.336211 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.424896 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.428616 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.564717 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.819131 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-875b9dd78-8t9g6"] Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.852724 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="ceilometer-central-agent" containerID="cri-o://a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb" gracePeriod=30 Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.852964 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0a22f21-beb0-44a1-943c-08547dc523a8","Type":"ContainerStarted","Data":"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9"} Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.852999 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.853210 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="proxy-httpd" containerID="cri-o://96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9" gracePeriod=30 Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.853274 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" 
containerName="sg-core" containerID="cri-o://69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812" gracePeriod=30 Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.853310 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="ceilometer-notification-agent" containerID="cri-o://84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8" gracePeriod=30 Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.890949 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.54588258 podStartE2EDuration="45.890932233s" podCreationTimestamp="2026-03-09 03:01:48 +0000 UTC" firstStartedPulling="2026-03-09 03:01:50.404370322 +0000 UTC m=+1234.994034054" lastFinishedPulling="2026-03-09 03:02:32.749419965 +0000 UTC m=+1277.339083707" observedRunningTime="2026-03-09 03:02:33.881918185 +0000 UTC m=+1278.471581917" watchObservedRunningTime="2026-03-09 03:02:33.890932233 +0000 UTC m=+1278.480595965" Mar 09 03:02:33 crc kubenswrapper[4901]: I0309 03:02:33.926377 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-bs2zk"] Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.162456 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d497f76dc-pptvt"] Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.344959 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-24d57" Mar 09 03:02:34 crc kubenswrapper[4901]: W0309 03:02:34.362049 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod907b295e_2a2e_4bb0_ba41_4aa445eb8a29.slice/crio-30824ba14fe53d011c60a0a56da541656b9ed79b46d94f5fc0a7b0560e8bad46 WatchSource:0}: Error finding container 30824ba14fe53d011c60a0a56da541656b9ed79b46d94f5fc0a7b0560e8bad46: Status 404 returned error can't find the container with id 30824ba14fe53d011c60a0a56da541656b9ed79b46d94f5fc0a7b0560e8bad46 Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.366438 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f87bb5fc6-qwprj"] Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.531664 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-combined-ca-bundle\") pod \"e040c407-4b37-4bee-b200-0d97b5767ef1\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.532080 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-scripts\") pod \"e040c407-4b37-4bee-b200-0d97b5767ef1\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.532134 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-db-sync-config-data\") pod \"e040c407-4b37-4bee-b200-0d97b5767ef1\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.532301 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/e040c407-4b37-4bee-b200-0d97b5767ef1-etc-machine-id\") pod \"e040c407-4b37-4bee-b200-0d97b5767ef1\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.532360 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-config-data\") pod \"e040c407-4b37-4bee-b200-0d97b5767ef1\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.532396 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8fr5\" (UniqueName: \"kubernetes.io/projected/e040c407-4b37-4bee-b200-0d97b5767ef1-kube-api-access-r8fr5\") pod \"e040c407-4b37-4bee-b200-0d97b5767ef1\" (UID: \"e040c407-4b37-4bee-b200-0d97b5767ef1\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.532515 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e040c407-4b37-4bee-b200-0d97b5767ef1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e040c407-4b37-4bee-b200-0d97b5767ef1" (UID: "e040c407-4b37-4bee-b200-0d97b5767ef1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.536130 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e040c407-4b37-4bee-b200-0d97b5767ef1" (UID: "e040c407-4b37-4bee-b200-0d97b5767ef1"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.536440 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-scripts" (OuterVolumeSpecName: "scripts") pod "e040c407-4b37-4bee-b200-0d97b5767ef1" (UID: "e040c407-4b37-4bee-b200-0d97b5767ef1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.549039 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e040c407-4b37-4bee-b200-0d97b5767ef1-kube-api-access-r8fr5" (OuterVolumeSpecName: "kube-api-access-r8fr5") pod "e040c407-4b37-4bee-b200-0d97b5767ef1" (UID: "e040c407-4b37-4bee-b200-0d97b5767ef1"). InnerVolumeSpecName "kube-api-access-r8fr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.560457 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e040c407-4b37-4bee-b200-0d97b5767ef1" (UID: "e040c407-4b37-4bee-b200-0d97b5767ef1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.600394 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-config-data" (OuterVolumeSpecName: "config-data") pod "e040c407-4b37-4bee-b200-0d97b5767ef1" (UID: "e040c407-4b37-4bee-b200-0d97b5767ef1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.635278 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.635310 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8fr5\" (UniqueName: \"kubernetes.io/projected/e040c407-4b37-4bee-b200-0d97b5767ef1-kube-api-access-r8fr5\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.635321 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.635331 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.635339 4901 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e040c407-4b37-4bee-b200-0d97b5767ef1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.635347 4901 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e040c407-4b37-4bee-b200-0d97b5767ef1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.874782 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f87bb5fc6-qwprj" event={"ID":"907b295e-2a2e-4bb0-ba41-4aa445eb8a29","Type":"ContainerStarted","Data":"b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171"} Mar 09 03:02:34 crc 
kubenswrapper[4901]: I0309 03:02:34.875002 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.875012 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f87bb5fc6-qwprj" event={"ID":"907b295e-2a2e-4bb0-ba41-4aa445eb8a29","Type":"ContainerStarted","Data":"5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.875021 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f87bb5fc6-qwprj" event={"ID":"907b295e-2a2e-4bb0-ba41-4aa445eb8a29","Type":"ContainerStarted","Data":"30824ba14fe53d011c60a0a56da541656b9ed79b46d94f5fc0a7b0560e8bad46"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.875127 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.879628 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-24d57" event={"ID":"e040c407-4b37-4bee-b200-0d97b5767ef1","Type":"ContainerDied","Data":"c4d80066ab01a9e40394d0f2dc80715b884a6e89d2fd47341d66b0347e3860bc"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.879652 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4d80066ab01a9e40394d0f2dc80715b884a6e89d2fd47341d66b0347e3860bc" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.879706 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-24d57" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.882949 4901 generic.go:334] "Generic (PLEG): container finished" podID="8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" containerID="dc3a599575b60a21c71f623e29f5f5b5582a07c90c9238d7f5e671558262e0e8" exitCode=0 Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.882997 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" event={"ID":"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e","Type":"ContainerDied","Data":"dc3a599575b60a21c71f623e29f5f5b5582a07c90c9238d7f5e671558262e0e8"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.883014 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" event={"ID":"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e","Type":"ContainerStarted","Data":"7743ff4a6782ca5cd749573ad52f1c5a6c2f9b4ba800487ffaad1961eb7aafea"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.884773 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" event={"ID":"baa336b3-abdd-43e2-9c54-6d8d34c71204","Type":"ContainerStarted","Data":"edffbabcd44631a243dfb610d1b83dcc827a66615b9e0f18860161ce19828a1c"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.898884 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.899311 4901 generic.go:334] "Generic (PLEG): container finished" podID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerID="96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9" exitCode=0 Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.899329 4901 generic.go:334] "Generic (PLEG): container finished" podID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerID="69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812" exitCode=2 Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.899337 4901 generic.go:334] "Generic (PLEG): container finished" podID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerID="84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8" exitCode=0 Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.899344 4901 generic.go:334] "Generic (PLEG): container finished" podID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerID="a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb" exitCode=0 Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.899380 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0a22f21-beb0-44a1-943c-08547dc523a8","Type":"ContainerDied","Data":"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.899403 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0a22f21-beb0-44a1-943c-08547dc523a8","Type":"ContainerDied","Data":"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.899413 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0a22f21-beb0-44a1-943c-08547dc523a8","Type":"ContainerDied","Data":"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8"} Mar 09 03:02:34 crc 
kubenswrapper[4901]: I0309 03:02:34.899422 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0a22f21-beb0-44a1-943c-08547dc523a8","Type":"ContainerDied","Data":"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.899431 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0a22f21-beb0-44a1-943c-08547dc523a8","Type":"ContainerDied","Data":"27224698f4f4a0df2a0d33bf1fec0814ffee80676a2af26cd1937b2e34d8651d"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.899445 4901 scope.go:117] "RemoveContainer" containerID="96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.910797 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f87bb5fc6-qwprj" podStartSLOduration=1.910777349 podStartE2EDuration="1.910777349s" podCreationTimestamp="2026-03-09 03:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:34.897575006 +0000 UTC m=+1279.487238728" watchObservedRunningTime="2026-03-09 03:02:34.910777349 +0000 UTC m=+1279.500441081" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.911825 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d497f76dc-pptvt" event={"ID":"6790ccc5-8f7f-4de8-bd69-652661631307","Type":"ContainerStarted","Data":"bca77a73491ce63027ad0e0375b417d26e30eb5d643291def8d0accfd5d9c349"} Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.944866 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-config-data\") pod \"d0a22f21-beb0-44a1-943c-08547dc523a8\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " Mar 09 
03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.944938 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-scripts\") pod \"d0a22f21-beb0-44a1-943c-08547dc523a8\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.944993 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-run-httpd\") pod \"d0a22f21-beb0-44a1-943c-08547dc523a8\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.945016 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-log-httpd\") pod \"d0a22f21-beb0-44a1-943c-08547dc523a8\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.945057 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-combined-ca-bundle\") pod \"d0a22f21-beb0-44a1-943c-08547dc523a8\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.945142 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgtt9\" (UniqueName: \"kubernetes.io/projected/d0a22f21-beb0-44a1-943c-08547dc523a8-kube-api-access-tgtt9\") pod \"d0a22f21-beb0-44a1-943c-08547dc523a8\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.945247 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-sg-core-conf-yaml\") pod \"d0a22f21-beb0-44a1-943c-08547dc523a8\" (UID: \"d0a22f21-beb0-44a1-943c-08547dc523a8\") " Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.956919 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d0a22f21-beb0-44a1-943c-08547dc523a8" (UID: "d0a22f21-beb0-44a1-943c-08547dc523a8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.975570 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d0a22f21-beb0-44a1-943c-08547dc523a8" (UID: "d0a22f21-beb0-44a1-943c-08547dc523a8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.986161 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a22f21-beb0-44a1-943c-08547dc523a8-kube-api-access-tgtt9" (OuterVolumeSpecName: "kube-api-access-tgtt9") pod "d0a22f21-beb0-44a1-943c-08547dc523a8" (UID: "d0a22f21-beb0-44a1-943c-08547dc523a8"). InnerVolumeSpecName "kube-api-access-tgtt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:34 crc kubenswrapper[4901]: I0309 03:02:34.986170 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-scripts" (OuterVolumeSpecName: "scripts") pod "d0a22f21-beb0-44a1-943c-08547dc523a8" (UID: "d0a22f21-beb0-44a1-943c-08547dc523a8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.048157 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.048181 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.048192 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0a22f21-beb0-44a1-943c-08547dc523a8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.048200 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgtt9\" (UniqueName: \"kubernetes.io/projected/d0a22f21-beb0-44a1-943c-08547dc523a8-kube-api-access-tgtt9\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.049407 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d0a22f21-beb0-44a1-943c-08547dc523a8" (UID: "d0a22f21-beb0-44a1-943c-08547dc523a8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.081587 4901 scope.go:117] "RemoveContainer" containerID="69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.138129 4901 scope.go:117] "RemoveContainer" containerID="84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.151796 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189066 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:02:35 crc kubenswrapper[4901]: E0309 03:02:35.189462 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e040c407-4b37-4bee-b200-0d97b5767ef1" containerName="cinder-db-sync" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189480 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e040c407-4b37-4bee-b200-0d97b5767ef1" containerName="cinder-db-sync" Mar 09 03:02:35 crc kubenswrapper[4901]: E0309 03:02:35.189489 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="sg-core" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189496 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="sg-core" Mar 09 03:02:35 crc kubenswrapper[4901]: E0309 03:02:35.189514 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="ceilometer-central-agent" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189521 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" 
containerName="ceilometer-central-agent" Mar 09 03:02:35 crc kubenswrapper[4901]: E0309 03:02:35.189534 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="ceilometer-notification-agent" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189540 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="ceilometer-notification-agent" Mar 09 03:02:35 crc kubenswrapper[4901]: E0309 03:02:35.189557 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="proxy-httpd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189562 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="proxy-httpd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189730 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="sg-core" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189749 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e040c407-4b37-4bee-b200-0d97b5767ef1" containerName="cinder-db-sync" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189763 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="ceilometer-notification-agent" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189775 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="proxy-httpd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.189791 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" containerName="ceilometer-central-agent" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.190609 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.197414 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.197623 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-f9knf" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.197762 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.198407 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 03:02:35 crc kubenswrapper[4901]: E0309 03:02:35.198439 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode040c407_4b37_4bee_b200_0d97b5767ef1.slice/crio-c4d80066ab01a9e40394d0f2dc80715b884a6e89d2fd47341d66b0347e3860bc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode040c407_4b37_4bee_b200_0d97b5767ef1.slice\": RecentStats: unable to find data in memory cache]" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.205039 4901 scope.go:117] "RemoveContainer" containerID="a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.205708 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.246918 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-bs2zk"] Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.253110 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.253232 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj2z5\" (UniqueName: \"kubernetes.io/projected/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-kube-api-access-mj2z5\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.253597 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.253716 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-scripts\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.253816 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.254158 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.259655 4901 scope.go:117] "RemoveContainer" containerID="96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.259919 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0a22f21-beb0-44a1-943c-08547dc523a8" (UID: "d0a22f21-beb0-44a1-943c-08547dc523a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:35 crc kubenswrapper[4901]: E0309 03:02:35.264047 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9\": container with ID starting with 96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9 not found: ID does not exist" containerID="96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.264087 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9"} err="failed to get container status \"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9\": rpc error: code = NotFound desc = could not find container \"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9\": container with ID starting with 96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9 not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.264111 4901 scope.go:117] "RemoveContainer" 
containerID="69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812" Mar 09 03:02:35 crc kubenswrapper[4901]: E0309 03:02:35.264458 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812\": container with ID starting with 69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812 not found: ID does not exist" containerID="69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.264478 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812"} err="failed to get container status \"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812\": rpc error: code = NotFound desc = could not find container \"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812\": container with ID starting with 69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812 not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.264491 4901 scope.go:117] "RemoveContainer" containerID="84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8" Mar 09 03:02:35 crc kubenswrapper[4901]: E0309 03:02:35.264735 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8\": container with ID starting with 84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8 not found: ID does not exist" containerID="84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.264761 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8"} err="failed to get container status \"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8\": rpc error: code = NotFound desc = could not find container \"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8\": container with ID starting with 84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8 not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.264772 4901 scope.go:117] "RemoveContainer" containerID="a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb" Mar 09 03:02:35 crc kubenswrapper[4901]: E0309 03:02:35.267036 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb\": container with ID starting with a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb not found: ID does not exist" containerID="a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.267068 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb"} err="failed to get container status \"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb\": rpc error: code = NotFound desc = could not find container \"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb\": container with ID starting with a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.267083 4901 scope.go:117] "RemoveContainer" containerID="96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.267336 4901 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9"} err="failed to get container status \"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9\": rpc error: code = NotFound desc = could not find container \"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9\": container with ID starting with 96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9 not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.267363 4901 scope.go:117] "RemoveContainer" containerID="69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.269432 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812"} err="failed to get container status \"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812\": rpc error: code = NotFound desc = could not find container \"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812\": container with ID starting with 69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812 not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.269455 4901 scope.go:117] "RemoveContainer" containerID="84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.269788 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8"} err="failed to get container status \"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8\": rpc error: code = NotFound desc = could not find container \"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8\": container with ID starting with 84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8 not 
found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.269809 4901 scope.go:117] "RemoveContainer" containerID="a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.275356 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-config-data" (OuterVolumeSpecName: "config-data") pod "d0a22f21-beb0-44a1-943c-08547dc523a8" (UID: "d0a22f21-beb0-44a1-943c-08547dc523a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.275481 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb"} err="failed to get container status \"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb\": rpc error: code = NotFound desc = could not find container \"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb\": container with ID starting with a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.275539 4901 scope.go:117] "RemoveContainer" containerID="96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.282619 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9"} err="failed to get container status \"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9\": rpc error: code = NotFound desc = could not find container \"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9\": container with ID starting with 96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9 not found: ID does not exist" 
Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.282692 4901 scope.go:117] "RemoveContainer" containerID="69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.283199 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812"} err="failed to get container status \"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812\": rpc error: code = NotFound desc = could not find container \"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812\": container with ID starting with 69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812 not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.283252 4901 scope.go:117] "RemoveContainer" containerID="84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.283584 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8"} err="failed to get container status \"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8\": rpc error: code = NotFound desc = could not find container \"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8\": container with ID starting with 84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8 not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.283628 4901 scope.go:117] "RemoveContainer" containerID="a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.283917 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb"} err="failed to get container status 
\"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb\": rpc error: code = NotFound desc = could not find container \"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb\": container with ID starting with a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.283967 4901 scope.go:117] "RemoveContainer" containerID="96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.284180 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9"} err="failed to get container status \"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9\": rpc error: code = NotFound desc = could not find container \"96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9\": container with ID starting with 96389301d6269e7091ac7b20fa6c1097f2bdac307d2e60bfe35648c8e26060b9 not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.284196 4901 scope.go:117] "RemoveContainer" containerID="69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.284430 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812"} err="failed to get container status \"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812\": rpc error: code = NotFound desc = could not find container \"69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812\": container with ID starting with 69e8a54e428b6d310fb06f38db4fd281b827264f761e0bfe9127e0f12d067812 not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.284462 4901 scope.go:117] "RemoveContainer" 
containerID="84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.284667 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8"} err="failed to get container status \"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8\": rpc error: code = NotFound desc = could not find container \"84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8\": container with ID starting with 84edc7847955c806b33a960202f07429804c90a4829772399bbe7190284967e8 not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.284683 4901 scope.go:117] "RemoveContainer" containerID="a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.284862 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb"} err="failed to get container status \"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb\": rpc error: code = NotFound desc = could not find container \"a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb\": container with ID starting with a048f67353dd52b46e0392d54861d5946c5eb7a4ed3aa23de4255953621009eb not found: ID does not exist" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.299526 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-64hwd"] Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.301058 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.339621 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-64hwd"] Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.354736 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.354794 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-config\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.354813 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-nb\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.354836 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.354892 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-sb\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.354932 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.354961 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj2z5\" (UniqueName: \"kubernetes.io/projected/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-kube-api-access-mj2z5\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.354979 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw2br\" (UniqueName: \"kubernetes.io/projected/2224855e-40d6-45cd-b001-18e3cc94610d-kube-api-access-hw2br\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.354995 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.355025 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-swift-storage-0\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.355043 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-svc\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.355081 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-scripts\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.355127 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.355138 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0a22f21-beb0-44a1-943c-08547dc523a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.355624 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.359562 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.361788 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-scripts\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.361867 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.362116 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.381299 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj2z5\" (UniqueName: \"kubernetes.io/projected/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-kube-api-access-mj2z5\") pod \"cinder-scheduler-0\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.447507 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.448937 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.454308 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.455702 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-scripts\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.455735 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-swift-storage-0\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.455830 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-svc\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.455981 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.456009 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.456055 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-logs\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.456096 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bpp9\" (UniqueName: \"kubernetes.io/projected/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-kube-api-access-6bpp9\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.456171 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data-custom\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.456253 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-config\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.456269 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-nb\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: 
\"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.456326 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-sb\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.456527 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.456659 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw2br\" (UniqueName: \"kubernetes.io/projected/2224855e-40d6-45cd-b001-18e3cc94610d-kube-api-access-hw2br\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.456551 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-swift-storage-0\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.457736 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-config\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " 
pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.457788 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-svc\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.457936 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-sb\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.458247 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-nb\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.464017 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.486305 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw2br\" (UniqueName: \"kubernetes.io/projected/2224855e-40d6-45cd-b001-18e3cc94610d-kube-api-access-hw2br\") pod \"dnsmasq-dns-765c5b6b49-64hwd\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.539601 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.558387 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data-custom\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.558478 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.558530 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-scripts\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.558576 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.558592 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.558607 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-logs\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.558629 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bpp9\" (UniqueName: \"kubernetes.io/projected/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-kube-api-access-6bpp9\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.558814 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.559175 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-logs\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.561502 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-scripts\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.562359 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.562848 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.565596 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data-custom\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.583076 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bpp9\" (UniqueName: \"kubernetes.io/projected/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-kube-api-access-6bpp9\") pod \"cinder-api-0\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.629534 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.766527 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.823517 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.919393 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.925119 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" event={"ID":"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e","Type":"ContainerStarted","Data":"9cf43734be8e6dc3ec33718d683ec770eaf32c81e43d5e3570abc6248a66f417"} Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.925351 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" podUID="8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" containerName="dnsmasq-dns" containerID="cri-o://9cf43734be8e6dc3ec33718d683ec770eaf32c81e43d5e3570abc6248a66f417" gracePeriod=10 Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.966145 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" podStartSLOduration=2.966124282 podStartE2EDuration="2.966124282s" podCreationTimestamp="2026-03-09 03:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:35.952666713 +0000 UTC m=+1280.542330445" watchObservedRunningTime="2026-03-09 03:02:35.966124282 +0000 UTC m=+1280.555788014" Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.973109 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.982020 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.996698 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:02:35 crc kubenswrapper[4901]: I0309 03:02:35.998646 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.001557 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.002172 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.009817 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.126819 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a22f21-beb0-44a1-943c-08547dc523a8" path="/var/lib/kubelet/pods/d0a22f21-beb0-44a1-943c-08547dc523a8/volumes" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.172012 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-64hwd"] Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.172980 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-scripts\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.173042 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.173061 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-config-data\") pod \"ceilometer-0\" (UID: 
\"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.173104 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-run-httpd\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.173142 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.173184 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-log-httpd\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.173201 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdzxk\" (UniqueName: \"kubernetes.io/projected/5157e341-8a57-4f03-8061-e6c8853dddb4-kube-api-access-kdzxk\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.274489 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.274531 
4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-config-data\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.274573 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-run-httpd\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.274609 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.274648 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-log-httpd\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.274688 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdzxk\" (UniqueName: \"kubernetes.io/projected/5157e341-8a57-4f03-8061-e6c8853dddb4-kube-api-access-kdzxk\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.274716 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-scripts\") pod \"ceilometer-0\" (UID: 
\"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.275050 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-run-httpd\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.275808 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-log-httpd\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.282774 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-config-data\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.283256 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-scripts\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.283468 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.287239 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.294820 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdzxk\" (UniqueName: \"kubernetes.io/projected/5157e341-8a57-4f03-8061-e6c8853dddb4-kube-api-access-kdzxk\") pod \"ceilometer-0\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") " pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.319764 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:02:36 crc kubenswrapper[4901]: W0309 03:02:36.402185 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod114dc2da_7bec_47de_a6f4_a37e2c56e1f8.slice/crio-fa7fc77893d67f929c1af8f14634b52181777a47d8e787146d600dfc23f00892 WatchSource:0}: Error finding container fa7fc77893d67f929c1af8f14634b52181777a47d8e787146d600dfc23f00892: Status 404 returned error can't find the container with id fa7fc77893d67f929c1af8f14634b52181777a47d8e787146d600dfc23f00892 Mar 09 03:02:36 crc kubenswrapper[4901]: W0309 03:02:36.402504 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2224855e_40d6_45cd_b001_18e3cc94610d.slice/crio-cbf0c3a00aaaa134deaab2830ae6a58dc74f3e70f0f557dab5d46534bdc85cb2 WatchSource:0}: Error finding container cbf0c3a00aaaa134deaab2830ae6a58dc74f3e70f0f557dab5d46534bdc85cb2: Status 404 returned error can't find the container with id cbf0c3a00aaaa134deaab2830ae6a58dc74f3e70f0f557dab5d46534bdc85cb2 Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.936145 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"114dc2da-7bec-47de-a6f4-a37e2c56e1f8","Type":"ContainerStarted","Data":"fa7fc77893d67f929c1af8f14634b52181777a47d8e787146d600dfc23f00892"} Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.937992 4901 generic.go:334] "Generic (PLEG): container finished" podID="8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" containerID="9cf43734be8e6dc3ec33718d683ec770eaf32c81e43d5e3570abc6248a66f417" exitCode=0 Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.938066 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" event={"ID":"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e","Type":"ContainerDied","Data":"9cf43734be8e6dc3ec33718d683ec770eaf32c81e43d5e3570abc6248a66f417"} Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.938095 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" event={"ID":"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e","Type":"ContainerDied","Data":"7743ff4a6782ca5cd749573ad52f1c5a6c2f9b4ba800487ffaad1961eb7aafea"} Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.938108 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7743ff4a6782ca5cd749573ad52f1c5a6c2f9b4ba800487ffaad1961eb7aafea" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.938980 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" event={"ID":"2224855e-40d6-45cd-b001-18e3cc94610d","Type":"ContainerStarted","Data":"cbf0c3a00aaaa134deaab2830ae6a58dc74f3e70f0f557dab5d46534bdc85cb2"} Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.977836 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.986351 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-swift-storage-0\") pod \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.986421 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-svc\") pod \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.986515 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-sb\") pod \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.986539 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-nb\") pod \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.986567 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-config\") pod \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.986611 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glpdj\" 
(UniqueName: \"kubernetes.io/projected/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-kube-api-access-glpdj\") pod \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\" (UID: \"8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e\") " Mar 09 03:02:36 crc kubenswrapper[4901]: I0309 03:02:36.996037 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-kube-api-access-glpdj" (OuterVolumeSpecName: "kube-api-access-glpdj") pod "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" (UID: "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e"). InnerVolumeSpecName "kube-api-access-glpdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.083796 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" (UID: "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.087920 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" (UID: "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.089014 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.089033 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.089043 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glpdj\" (UniqueName: \"kubernetes.io/projected/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-kube-api-access-glpdj\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.089391 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-config" (OuterVolumeSpecName: "config") pod "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" (UID: "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.116595 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" (UID: "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.117096 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" (UID: "8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.194429 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.194462 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.194472 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.447352 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:02:37 crc kubenswrapper[4901]: W0309 03:02:37.455134 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a49290e_8dbd_47d0_9fa6_babbf3b53b96.slice/crio-c5a5622cde38959975a0ceaf782f72f108b4a121c7e14ce8809126c4a1607365 WatchSource:0}: Error finding container c5a5622cde38959975a0ceaf782f72f108b4a121c7e14ce8809126c4a1607365: Status 404 returned error can't find the container with id c5a5622cde38959975a0ceaf782f72f108b4a121c7e14ce8809126c4a1607365 
Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.525771 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.964627 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5157e341-8a57-4f03-8061-e6c8853dddb4","Type":"ContainerStarted","Data":"025c1f8e927ed124ac565ae6e468bf87410439ab05e8e0908ea26de249ac72bf"} Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.974832 4901 generic.go:334] "Generic (PLEG): container finished" podID="2224855e-40d6-45cd-b001-18e3cc94610d" containerID="51dd6706760fdfd6e90bc55edfe74e1ead6282dd2f77159077d8a1101b087c17" exitCode=0 Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.974892 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" event={"ID":"2224855e-40d6-45cd-b001-18e3cc94610d","Type":"ContainerDied","Data":"51dd6706760fdfd6e90bc55edfe74e1ead6282dd2f77159077d8a1101b087c17"} Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.983421 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:02:37 crc kubenswrapper[4901]: I0309 03:02:37.990171 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3a49290e-8dbd-47d0-9fa6-babbf3b53b96","Type":"ContainerStarted","Data":"c5a5622cde38959975a0ceaf782f72f108b4a121c7e14ce8809126c4a1607365"} Mar 09 03:02:38 crc kubenswrapper[4901]: I0309 03:02:38.009527 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" event={"ID":"baa336b3-abdd-43e2-9c54-6d8d34c71204","Type":"ContainerStarted","Data":"3ed5c94eba80813636cf9478d7655211647869ac0b7da74e90655f7e8fc79465"} Mar 09 03:02:38 crc kubenswrapper[4901]: I0309 03:02:38.009575 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" 
event={"ID":"baa336b3-abdd-43e2-9c54-6d8d34c71204","Type":"ContainerStarted","Data":"6e6b9a76927163c431cd48ce16cec53d765e06541e962dcf3dccf5f2b20e4b6e"} Mar 09 03:02:38 crc kubenswrapper[4901]: I0309 03:02:38.026383 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:02:38 crc kubenswrapper[4901]: I0309 03:02:38.026450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d497f76dc-pptvt" event={"ID":"6790ccc5-8f7f-4de8-bd69-652661631307","Type":"ContainerStarted","Data":"c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73"} Mar 09 03:02:38 crc kubenswrapper[4901]: I0309 03:02:38.026517 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d497f76dc-pptvt" event={"ID":"6790ccc5-8f7f-4de8-bd69-652661631307","Type":"ContainerStarted","Data":"4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772"} Mar 09 03:02:38 crc kubenswrapper[4901]: I0309 03:02:38.073533 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" podStartSLOduration=2.997681644 podStartE2EDuration="6.073515536s" podCreationTimestamp="2026-03-09 03:02:32 +0000 UTC" firstStartedPulling="2026-03-09 03:02:33.840779627 +0000 UTC m=+1278.430443359" lastFinishedPulling="2026-03-09 03:02:36.916613519 +0000 UTC m=+1281.506277251" observedRunningTime="2026-03-09 03:02:38.040764459 +0000 UTC m=+1282.630428191" watchObservedRunningTime="2026-03-09 03:02:38.073515536 +0000 UTC m=+1282.663179268" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.124804 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" event={"ID":"2224855e-40d6-45cd-b001-18e3cc94610d","Type":"ContainerStarted","Data":"2cabf147c324c9b0d8e1ce65fce8ffa92f5c6ad3220ab420158e1d3de32dea84"} Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.125565 
4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.133880 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3a49290e-8dbd-47d0-9fa6-babbf3b53b96","Type":"ContainerStarted","Data":"6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe"} Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.144441 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"114dc2da-7bec-47de-a6f4-a37e2c56e1f8","Type":"ContainerStarted","Data":"a55bddb0310856452ae351719287141366e86613b5d5808d8245686cb1b2b46c"} Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.144485 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"114dc2da-7bec-47de-a6f4-a37e2c56e1f8","Type":"ContainerStarted","Data":"25c4d07f357e45acabb171f36d58fbe887d3725a24d1b8089d0660e83c055af4"} Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.152185 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7d497f76dc-pptvt" podStartSLOduration=4.39198825 podStartE2EDuration="7.152169776s" podCreationTimestamp="2026-03-09 03:02:32 +0000 UTC" firstStartedPulling="2026-03-09 03:02:34.168891788 +0000 UTC m=+1278.758555520" lastFinishedPulling="2026-03-09 03:02:36.929073314 +0000 UTC m=+1281.518737046" observedRunningTime="2026-03-09 03:02:38.068173071 +0000 UTC m=+1282.657836803" watchObservedRunningTime="2026-03-09 03:02:39.152169776 +0000 UTC m=+1283.741833508" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.154458 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5157e341-8a57-4f03-8061-e6c8853dddb4","Type":"ContainerStarted","Data":"a0c62ebde44dc827946759905296f74d7febc121c09c142bec80c26419afec3b"} Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 
03:02:39.163445 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" podStartSLOduration=4.16342454 podStartE2EDuration="4.16342454s" podCreationTimestamp="2026-03-09 03:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:39.150284589 +0000 UTC m=+1283.739948331" watchObservedRunningTime="2026-03-09 03:02:39.16342454 +0000 UTC m=+1283.753088272" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.173032 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.295713072 podStartE2EDuration="4.173017592s" podCreationTimestamp="2026-03-09 03:02:35 +0000 UTC" firstStartedPulling="2026-03-09 03:02:36.406869125 +0000 UTC m=+1280.996532857" lastFinishedPulling="2026-03-09 03:02:37.284173645 +0000 UTC m=+1281.873837377" observedRunningTime="2026-03-09 03:02:39.171534355 +0000 UTC m=+1283.761198107" watchObservedRunningTime="2026-03-09 03:02:39.173017592 +0000 UTC m=+1283.762681324" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.884109 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6cc8485b48-f86rl"] Mar 09 03:02:39 crc kubenswrapper[4901]: E0309 03:02:39.884941 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" containerName="dnsmasq-dns" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.885031 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" containerName="dnsmasq-dns" Mar 09 03:02:39 crc kubenswrapper[4901]: E0309 03:02:39.885119 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" containerName="init" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.885191 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" containerName="init" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.885479 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" containerName="dnsmasq-dns" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.886648 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.888322 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.889747 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 09 03:02:39 crc kubenswrapper[4901]: I0309 03:02:39.901037 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cc8485b48-f86rl"] Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.049707 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-internal-tls-certs\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.049969 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.050151 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-combined-ca-bundle\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.050328 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-logs\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.050460 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data-custom\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.050658 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-public-tls-certs\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.050784 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws4pt\" (UniqueName: \"kubernetes.io/projected/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-kube-api-access-ws4pt\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.152810 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-logs\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.153879 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data-custom\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.153166 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-logs\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.154169 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-public-tls-certs\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.154307 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws4pt\" (UniqueName: \"kubernetes.io/projected/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-kube-api-access-ws4pt\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.154469 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-internal-tls-certs\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.154573 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.154736 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-combined-ca-bundle\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.162300 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-public-tls-certs\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.167370 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-internal-tls-certs\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.167971 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-combined-ca-bundle\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.168250 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3a49290e-8dbd-47d0-9fa6-babbf3b53b96","Type":"ContainerStarted","Data":"516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080"} Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.168398 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" containerName="cinder-api-log" containerID="cri-o://6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe" gracePeriod=30 Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.168502 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.168567 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data-custom\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.168657 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" containerName="cinder-api" containerID="cri-o://516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080" gracePeriod=30 Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.176448 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data\") pod 
\"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.183376 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5157e341-8a57-4f03-8061-e6c8853dddb4","Type":"ContainerStarted","Data":"699fbdc0c622a452d327135f74b14302cd75bb5453d66ac8e9692e83dbabf682"} Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.186206 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.186195122 podStartE2EDuration="5.186195122s" podCreationTimestamp="2026-03-09 03:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:40.185129715 +0000 UTC m=+1284.774793447" watchObservedRunningTime="2026-03-09 03:02:40.186195122 +0000 UTC m=+1284.775858854" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.196726 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws4pt\" (UniqueName: \"kubernetes.io/projected/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-kube-api-access-ws4pt\") pod \"barbican-api-6cc8485b48-f86rl\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.209606 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.540417 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.720145 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cc8485b48-f86rl"] Mar 09 03:02:40 crc kubenswrapper[4901]: W0309 03:02:40.721422 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5efc6dd_6a36_4491_b090_b4c9301ec7d0.slice/crio-dc7315e94862f6ebce57ac75530e055aa7deb921396c97632ce6afa930c45ff9 WatchSource:0}: Error finding container dc7315e94862f6ebce57ac75530e055aa7deb921396c97632ce6afa930c45ff9: Status 404 returned error can't find the container with id dc7315e94862f6ebce57ac75530e055aa7deb921396c97632ce6afa930c45ff9 Mar 09 03:02:40 crc kubenswrapper[4901]: I0309 03:02:40.970350 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.074938 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-scripts\") pod \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.074968 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data\") pod \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.075006 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-etc-machine-id\") pod \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.075038 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-logs\") pod \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.075063 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-combined-ca-bundle\") pod \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.075158 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3a49290e-8dbd-47d0-9fa6-babbf3b53b96" (UID: "3a49290e-8dbd-47d0-9fa6-babbf3b53b96"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.075207 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data-custom\") pod \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.075255 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bpp9\" (UniqueName: \"kubernetes.io/projected/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-kube-api-access-6bpp9\") pod \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\" (UID: \"3a49290e-8dbd-47d0-9fa6-babbf3b53b96\") " Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.075479 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-logs" (OuterVolumeSpecName: "logs") pod "3a49290e-8dbd-47d0-9fa6-babbf3b53b96" (UID: "3a49290e-8dbd-47d0-9fa6-babbf3b53b96"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.075802 4901 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.075822 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.079185 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-kube-api-access-6bpp9" (OuterVolumeSpecName: "kube-api-access-6bpp9") pod "3a49290e-8dbd-47d0-9fa6-babbf3b53b96" (UID: "3a49290e-8dbd-47d0-9fa6-babbf3b53b96"). InnerVolumeSpecName "kube-api-access-6bpp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.079599 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3a49290e-8dbd-47d0-9fa6-babbf3b53b96" (UID: "3a49290e-8dbd-47d0-9fa6-babbf3b53b96"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.079768 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-scripts" (OuterVolumeSpecName: "scripts") pod "3a49290e-8dbd-47d0-9fa6-babbf3b53b96" (UID: "3a49290e-8dbd-47d0-9fa6-babbf3b53b96"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.099616 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a49290e-8dbd-47d0-9fa6-babbf3b53b96" (UID: "3a49290e-8dbd-47d0-9fa6-babbf3b53b96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.123684 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data" (OuterVolumeSpecName: "config-data") pod "3a49290e-8dbd-47d0-9fa6-babbf3b53b96" (UID: "3a49290e-8dbd-47d0-9fa6-babbf3b53b96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.177130 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.177162 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bpp9\" (UniqueName: \"kubernetes.io/projected/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-kube-api-access-6bpp9\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.177175 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.177184 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 
03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.177193 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a49290e-8dbd-47d0-9fa6-babbf3b53b96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.191789 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc8485b48-f86rl" event={"ID":"e5efc6dd-6a36-4491-b090-b4c9301ec7d0","Type":"ContainerStarted","Data":"cf4f1b9588039d52f25928fa5f258488b5893e580645e30a5d8a63dcf3396c0b"} Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.191828 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc8485b48-f86rl" event={"ID":"e5efc6dd-6a36-4491-b090-b4c9301ec7d0","Type":"ContainerStarted","Data":"148d0935d0af545a31bff0013cab741797e444db8fdba8b7ef3fe82da71d3a67"} Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.191837 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc8485b48-f86rl" event={"ID":"e5efc6dd-6a36-4491-b090-b4c9301ec7d0","Type":"ContainerStarted","Data":"dc7315e94862f6ebce57ac75530e055aa7deb921396c97632ce6afa930c45ff9"} Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.192691 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.192714 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.196216 4901 generic.go:334] "Generic (PLEG): container finished" podID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" containerID="516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080" exitCode=0 Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.196256 4901 generic.go:334] "Generic (PLEG): container finished" podID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" 
containerID="6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe" exitCode=143 Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.196292 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3a49290e-8dbd-47d0-9fa6-babbf3b53b96","Type":"ContainerDied","Data":"516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080"} Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.196312 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3a49290e-8dbd-47d0-9fa6-babbf3b53b96","Type":"ContainerDied","Data":"6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe"} Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.196322 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3a49290e-8dbd-47d0-9fa6-babbf3b53b96","Type":"ContainerDied","Data":"c5a5622cde38959975a0ceaf782f72f108b4a121c7e14ce8809126c4a1607365"} Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.196336 4901 scope.go:117] "RemoveContainer" containerID="516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.196438 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.201058 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5157e341-8a57-4f03-8061-e6c8853dddb4","Type":"ContainerStarted","Data":"575fa649b573943652ea995943aae329e269a75ec468388a3fb8f2d76c031b60"} Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.222071 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6cc8485b48-f86rl" podStartSLOduration=2.222055553 podStartE2EDuration="2.222055553s" podCreationTimestamp="2026-03-09 03:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:41.216210696 +0000 UTC m=+1285.805874478" watchObservedRunningTime="2026-03-09 03:02:41.222055553 +0000 UTC m=+1285.811719285" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.287667 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.296450 4901 scope.go:117] "RemoveContainer" containerID="6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.307400 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.319581 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:02:41 crc kubenswrapper[4901]: E0309 03:02:41.319969 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" containerName="cinder-api-log" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.319982 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" containerName="cinder-api-log" Mar 09 03:02:41 crc kubenswrapper[4901]: E0309 
03:02:41.320010 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" containerName="cinder-api" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.320016 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" containerName="cinder-api" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.320212 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" containerName="cinder-api" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.320220 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" containerName="cinder-api-log" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.321564 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.323773 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.324059 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.324166 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.335555 4901 scope.go:117] "RemoveContainer" containerID="516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080" Mar 09 03:02:41 crc kubenswrapper[4901]: E0309 03:02:41.335950 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080\": container with ID starting with 516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080 not found: ID does not exist" 
containerID="516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.335979 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080"} err="failed to get container status \"516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080\": rpc error: code = NotFound desc = could not find container \"516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080\": container with ID starting with 516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080 not found: ID does not exist" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.336003 4901 scope.go:117] "RemoveContainer" containerID="6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe" Mar 09 03:02:41 crc kubenswrapper[4901]: E0309 03:02:41.336624 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe\": container with ID starting with 6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe not found: ID does not exist" containerID="6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.336663 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe"} err="failed to get container status \"6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe\": rpc error: code = NotFound desc = could not find container \"6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe\": container with ID starting with 6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe not found: ID does not exist" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.336690 4901 scope.go:117] 
"RemoveContainer" containerID="516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.337050 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080"} err="failed to get container status \"516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080\": rpc error: code = NotFound desc = could not find container \"516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080\": container with ID starting with 516851314c1c141458a097240df1375a133b9905fece9e4818bf88a408f83080 not found: ID does not exist" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.337068 4901 scope.go:117] "RemoveContainer" containerID="6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.337421 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe"} err="failed to get container status \"6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe\": rpc error: code = NotFound desc = could not find container \"6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe\": container with ID starting with 6bbe3b8c869d602179800a4301c3807c9e1aeb60e71d30d88632cbe573bfd8fe not found: ID does not exist" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.357138 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.483377 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-scripts\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc 
kubenswrapper[4901]: I0309 03:02:41.483435 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719d451b-159a-4fa7-9c72-54f42fb4f216-logs\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.483473 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.483614 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.483722 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmww\" (UniqueName: \"kubernetes.io/projected/719d451b-159a-4fa7-9c72-54f42fb4f216-kube-api-access-hhmww\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.483796 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/719d451b-159a-4fa7-9c72-54f42fb4f216-etc-machine-id\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.483823 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-public-tls-certs\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.483876 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.483987 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data-custom\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.585776 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.585832 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data-custom\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.585874 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-scripts\") pod 
\"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.585896 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719d451b-159a-4fa7-9c72-54f42fb4f216-logs\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.585927 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.585978 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.586007 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmww\" (UniqueName: \"kubernetes.io/projected/719d451b-159a-4fa7-9c72-54f42fb4f216-kube-api-access-hhmww\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.586033 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/719d451b-159a-4fa7-9c72-54f42fb4f216-etc-machine-id\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.586047 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-public-tls-certs\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.586687 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719d451b-159a-4fa7-9c72-54f42fb4f216-logs\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.587643 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/719d451b-159a-4fa7-9c72-54f42fb4f216-etc-machine-id\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.592158 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.592517 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-scripts\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.593108 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc 
kubenswrapper[4901]: I0309 03:02:41.593601 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.593612 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-public-tls-certs\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.595092 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data-custom\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.604023 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmww\" (UniqueName: \"kubernetes.io/projected/719d451b-159a-4fa7-9c72-54f42fb4f216-kube-api-access-hhmww\") pod \"cinder-api-0\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " pod="openstack/cinder-api-0" Mar 09 03:02:41 crc kubenswrapper[4901]: I0309 03:02:41.636479 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 03:02:42 crc kubenswrapper[4901]: I0309 03:02:42.082720 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:02:42 crc kubenswrapper[4901]: I0309 03:02:42.116784 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a49290e-8dbd-47d0-9fa6-babbf3b53b96" path="/var/lib/kubelet/pods/3a49290e-8dbd-47d0-9fa6-babbf3b53b96/volumes" Mar 09 03:02:42 crc kubenswrapper[4901]: I0309 03:02:42.225242 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"719d451b-159a-4fa7-9c72-54f42fb4f216","Type":"ContainerStarted","Data":"ae52503a05f8a417fd0d4fe6767b11247b012f0c8a7497a4bdaf32ad11e4ccb2"} Mar 09 03:02:43 crc kubenswrapper[4901]: I0309 03:02:43.243435 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"719d451b-159a-4fa7-9c72-54f42fb4f216","Type":"ContainerStarted","Data":"3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5"} Mar 09 03:02:43 crc kubenswrapper[4901]: I0309 03:02:43.248302 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5157e341-8a57-4f03-8061-e6c8853dddb4","Type":"ContainerStarted","Data":"b8495428bc91f79334a918c6d694f02ee4ef2e45cc7507747ca003edd66fe46d"} Mar 09 03:02:43 crc kubenswrapper[4901]: I0309 03:02:43.248467 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 03:02:43 crc kubenswrapper[4901]: I0309 03:02:43.269073 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.862971663 podStartE2EDuration="8.269058562s" podCreationTimestamp="2026-03-09 03:02:35 +0000 UTC" firstStartedPulling="2026-03-09 03:02:37.535682573 +0000 UTC m=+1282.125346305" lastFinishedPulling="2026-03-09 03:02:42.941769462 +0000 UTC m=+1287.531433204" observedRunningTime="2026-03-09 
03:02:43.267993895 +0000 UTC m=+1287.857657627" watchObservedRunningTime="2026-03-09 03:02:43.269058562 +0000 UTC m=+1287.858722294" Mar 09 03:02:44 crc kubenswrapper[4901]: I0309 03:02:44.258014 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"719d451b-159a-4fa7-9c72-54f42fb4f216","Type":"ContainerStarted","Data":"bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7"} Mar 09 03:02:44 crc kubenswrapper[4901]: I0309 03:02:44.283680 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.283662557 podStartE2EDuration="3.283662557s" podCreationTimestamp="2026-03-09 03:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:44.276435804 +0000 UTC m=+1288.866099546" watchObservedRunningTime="2026-03-09 03:02:44.283662557 +0000 UTC m=+1288.873326299" Mar 09 03:02:44 crc kubenswrapper[4901]: I0309 03:02:44.676117 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:02:44 crc kubenswrapper[4901]: I0309 03:02:44.953022 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:44 crc kubenswrapper[4901]: I0309 03:02:44.958292 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57d6545db5-vnkq4"] Mar 09 03:02:44 crc kubenswrapper[4901]: I0309 03:02:44.958545 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57d6545db5-vnkq4" podUID="7f7aec5c-9887-4331-8806-3164120e927e" containerName="neutron-api" containerID="cri-o://d470f4319d83b0f2465857ddb8bb7578426b84cca0130f611b83b80a7b4431a5" gracePeriod=30 Mar 09 03:02:44 crc kubenswrapper[4901]: I0309 03:02:44.958835 4901 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-57d6545db5-vnkq4" podUID="7f7aec5c-9887-4331-8806-3164120e927e" containerName="neutron-httpd" containerID="cri-o://8c06a059ceda4f7e216a798de721d42fa01fd9f026bcf1c558dc22f3544edb04" gracePeriod=30 Mar 09 03:02:44 crc kubenswrapper[4901]: I0309 03:02:44.981459 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-57d6545db5-vnkq4" podUID="7f7aec5c-9887-4331-8806-3164120e927e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": read tcp 10.217.0.2:52748->10.217.0.157:9696: read: connection reset by peer" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.027428 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b7f6df545-whtgc"] Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.030090 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.050591 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b7f6df545-whtgc"] Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.079786 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.170128 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-ovndb-tls-certs\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.170166 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-config\") pod \"neutron-5b7f6df545-whtgc\" (UID: 
\"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.170199 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrv9x\" (UniqueName: \"kubernetes.io/projected/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-kube-api-access-hrv9x\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.170218 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-httpd-config\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.170284 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-public-tls-certs\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.170331 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-internal-tls-certs\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.170392 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-combined-ca-bundle\") pod 
\"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.265819 4901 generic.go:334] "Generic (PLEG): container finished" podID="7f7aec5c-9887-4331-8806-3164120e927e" containerID="8c06a059ceda4f7e216a798de721d42fa01fd9f026bcf1c558dc22f3544edb04" exitCode=0 Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.266238 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d6545db5-vnkq4" event={"ID":"7f7aec5c-9887-4331-8806-3164120e927e","Type":"ContainerDied","Data":"8c06a059ceda4f7e216a798de721d42fa01fd9f026bcf1c558dc22f3544edb04"} Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.268079 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.272013 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrv9x\" (UniqueName: \"kubernetes.io/projected/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-kube-api-access-hrv9x\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.272050 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-httpd-config\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.272115 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-public-tls-certs\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 
09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.272185 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-internal-tls-certs\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.272268 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-combined-ca-bundle\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.272295 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-ovndb-tls-certs\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.273773 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-config\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.287522 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-combined-ca-bundle\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.289749 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-public-tls-certs\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.290308 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-internal-tls-certs\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.294005 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-httpd-config\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.294296 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-config\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.294428 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-ovndb-tls-certs\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.298873 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrv9x\" (UniqueName: 
\"kubernetes.io/projected/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-kube-api-access-hrv9x\") pod \"neutron-5b7f6df545-whtgc\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.353577 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.630353 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.680737 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-4psgt"] Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.680953 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" podUID="1da2c732-ff7f-4359-8ce4-575fa65b8da0" containerName="dnsmasq-dns" containerID="cri-o://b061f4aaf549eb86d1d7efe612297014d2cec92cbdf9b8e77e24149dce0551ac" gracePeriod=10 Mar 09 03:02:45 crc kubenswrapper[4901]: I0309 03:02:45.932856 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b7f6df545-whtgc"] Mar 09 03:02:45 crc kubenswrapper[4901]: W0309 03:02:45.966980 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda29f795d_59d2_4e43_a6ee_6190dc0ad67d.slice/crio-0a9683b7241c768c2c7616fd8067934d41f3bad519d9e1af200d5f4cc77411d5 WatchSource:0}: Error finding container 0a9683b7241c768c2c7616fd8067934d41f3bad519d9e1af200d5f4cc77411d5: Status 404 returned error can't find the container with id 0a9683b7241c768c2c7616fd8067934d41f3bad519d9e1af200d5f4cc77411d5 Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.006581 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 03:02:46 crc 
kubenswrapper[4901]: I0309 03:02:46.072917 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.273921 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b7f6df545-whtgc" event={"ID":"a29f795d-59d2-4e43-a6ee-6190dc0ad67d","Type":"ContainerStarted","Data":"0a9683b7241c768c2c7616fd8067934d41f3bad519d9e1af200d5f4cc77411d5"} Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.275535 4901 generic.go:334] "Generic (PLEG): container finished" podID="1da2c732-ff7f-4359-8ce4-575fa65b8da0" containerID="b061f4aaf549eb86d1d7efe612297014d2cec92cbdf9b8e77e24149dce0551ac" exitCode=0 Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.275702 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" containerName="cinder-scheduler" containerID="cri-o://25c4d07f357e45acabb171f36d58fbe887d3725a24d1b8089d0660e83c055af4" gracePeriod=30 Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.275906 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" event={"ID":"1da2c732-ff7f-4359-8ce4-575fa65b8da0","Type":"ContainerDied","Data":"b061f4aaf549eb86d1d7efe612297014d2cec92cbdf9b8e77e24149dce0551ac"} Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.275930 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" event={"ID":"1da2c732-ff7f-4359-8ce4-575fa65b8da0","Type":"ContainerDied","Data":"d3f6eae51aa94690b4c937feaa0a4ada44470bf588890dfd09bc5704b1245748"} Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.275940 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3f6eae51aa94690b4c937feaa0a4ada44470bf588890dfd09bc5704b1245748" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.276910 4901 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" containerName="probe" containerID="cri-o://a55bddb0310856452ae351719287141366e86613b5d5808d8245686cb1b2b46c" gracePeriod=30 Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.335692 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.497964 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-nb\") pod \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.498054 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-swift-storage-0\") pod \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.498164 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-svc\") pod \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.498189 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-sb\") pod \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.498282 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-config\") pod \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.498325 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5plbp\" (UniqueName: \"kubernetes.io/projected/1da2c732-ff7f-4359-8ce4-575fa65b8da0-kube-api-access-5plbp\") pod \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\" (UID: \"1da2c732-ff7f-4359-8ce4-575fa65b8da0\") " Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.505044 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da2c732-ff7f-4359-8ce4-575fa65b8da0-kube-api-access-5plbp" (OuterVolumeSpecName: "kube-api-access-5plbp") pod "1da2c732-ff7f-4359-8ce4-575fa65b8da0" (UID: "1da2c732-ff7f-4359-8ce4-575fa65b8da0"). InnerVolumeSpecName "kube-api-access-5plbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.567904 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1da2c732-ff7f-4359-8ce4-575fa65b8da0" (UID: "1da2c732-ff7f-4359-8ce4-575fa65b8da0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.573682 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1da2c732-ff7f-4359-8ce4-575fa65b8da0" (UID: "1da2c732-ff7f-4359-8ce4-575fa65b8da0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.578568 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1da2c732-ff7f-4359-8ce4-575fa65b8da0" (UID: "1da2c732-ff7f-4359-8ce4-575fa65b8da0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.584815 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1da2c732-ff7f-4359-8ce4-575fa65b8da0" (UID: "1da2c732-ff7f-4359-8ce4-575fa65b8da0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.594102 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-config" (OuterVolumeSpecName: "config") pod "1da2c732-ff7f-4359-8ce4-575fa65b8da0" (UID: "1da2c732-ff7f-4359-8ce4-575fa65b8da0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.600191 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.600350 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.600432 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.600504 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5plbp\" (UniqueName: \"kubernetes.io/projected/1da2c732-ff7f-4359-8ce4-575fa65b8da0-kube-api-access-5plbp\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.600565 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:46 crc kubenswrapper[4901]: I0309 03:02:46.600633 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1da2c732-ff7f-4359-8ce4-575fa65b8da0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:47 crc kubenswrapper[4901]: I0309 03:02:47.174173 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-57d6545db5-vnkq4" podUID="7f7aec5c-9887-4331-8806-3164120e927e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: 
connection refused" Mar 09 03:02:47 crc kubenswrapper[4901]: I0309 03:02:47.283792 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cbd95f657-4psgt" Mar 09 03:02:47 crc kubenswrapper[4901]: I0309 03:02:47.290934 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b7f6df545-whtgc" event={"ID":"a29f795d-59d2-4e43-a6ee-6190dc0ad67d","Type":"ContainerStarted","Data":"42ab54fcd0a589e516d8eb72267b465d9fcb7af80252fced9a388a56a784446b"} Mar 09 03:02:47 crc kubenswrapper[4901]: I0309 03:02:47.290960 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b7f6df545-whtgc" event={"ID":"a29f795d-59d2-4e43-a6ee-6190dc0ad67d","Type":"ContainerStarted","Data":"fd7c32b5c4e4206c35907a41758ab9c087a3999127a4db227227c92656f344b5"} Mar 09 03:02:47 crc kubenswrapper[4901]: I0309 03:02:47.290972 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:02:47 crc kubenswrapper[4901]: I0309 03:02:47.319833 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b7f6df545-whtgc" podStartSLOduration=3.319816388 podStartE2EDuration="3.319816388s" podCreationTimestamp="2026-03-09 03:02:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:47.315996322 +0000 UTC m=+1291.905660054" watchObservedRunningTime="2026-03-09 03:02:47.319816388 +0000 UTC m=+1291.909480120" Mar 09 03:02:47 crc kubenswrapper[4901]: I0309 03:02:47.351012 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-4psgt"] Mar 09 03:02:47 crc kubenswrapper[4901]: I0309 03:02:47.357942 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cbd95f657-4psgt"] Mar 09 03:02:47 crc kubenswrapper[4901]: I0309 03:02:47.871530 4901 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:48 crc kubenswrapper[4901]: I0309 03:02:48.127956 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1da2c732-ff7f-4359-8ce4-575fa65b8da0" path="/var/lib/kubelet/pods/1da2c732-ff7f-4359-8ce4-575fa65b8da0/volumes" Mar 09 03:02:48 crc kubenswrapper[4901]: I0309 03:02:48.294552 4901 generic.go:334] "Generic (PLEG): container finished" podID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" containerID="a55bddb0310856452ae351719287141366e86613b5d5808d8245686cb1b2b46c" exitCode=0 Mar 09 03:02:48 crc kubenswrapper[4901]: I0309 03:02:48.295274 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"114dc2da-7bec-47de-a6f4-a37e2c56e1f8","Type":"ContainerDied","Data":"a55bddb0310856452ae351719287141366e86613b5d5808d8245686cb1b2b46c"} Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.145731 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.214694 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f87bb5fc6-qwprj"] Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.215238 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f87bb5fc6-qwprj" podUID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" containerName="barbican-api-log" containerID="cri-o://5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5" gracePeriod=30 Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.215308 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f87bb5fc6-qwprj" podUID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" containerName="barbican-api" containerID="cri-o://b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171" gracePeriod=30 Mar 09 03:02:49 crc 
kubenswrapper[4901]: I0309 03:02:49.342921 4901 generic.go:334] "Generic (PLEG): container finished" podID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" containerID="25c4d07f357e45acabb171f36d58fbe887d3725a24d1b8089d0660e83c055af4" exitCode=0 Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.343196 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"114dc2da-7bec-47de-a6f4-a37e2c56e1f8","Type":"ContainerDied","Data":"25c4d07f357e45acabb171f36d58fbe887d3725a24d1b8089d0660e83c055af4"} Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.524561 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.666768 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj2z5\" (UniqueName: \"kubernetes.io/projected/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-kube-api-access-mj2z5\") pod \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.666864 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-scripts\") pod \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.666921 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-etc-machine-id\") pod \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.666937 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data-custom\") pod \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.666997 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-combined-ca-bundle\") pod \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.667019 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data\") pod \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\" (UID: \"114dc2da-7bec-47de-a6f4-a37e2c56e1f8\") " Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.667831 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "114dc2da-7bec-47de-a6f4-a37e2c56e1f8" (UID: "114dc2da-7bec-47de-a6f4-a37e2c56e1f8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.673978 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "114dc2da-7bec-47de-a6f4-a37e2c56e1f8" (UID: "114dc2da-7bec-47de-a6f4-a37e2c56e1f8"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.674155 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-kube-api-access-mj2z5" (OuterVolumeSpecName: "kube-api-access-mj2z5") pod "114dc2da-7bec-47de-a6f4-a37e2c56e1f8" (UID: "114dc2da-7bec-47de-a6f4-a37e2c56e1f8"). InnerVolumeSpecName "kube-api-access-mj2z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.685352 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-scripts" (OuterVolumeSpecName: "scripts") pod "114dc2da-7bec-47de-a6f4-a37e2c56e1f8" (UID: "114dc2da-7bec-47de-a6f4-a37e2c56e1f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.736310 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "114dc2da-7bec-47de-a6f4-a37e2c56e1f8" (UID: "114dc2da-7bec-47de-a6f4-a37e2c56e1f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.769356 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.769394 4901 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.769405 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.769414 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.769424 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj2z5\" (UniqueName: \"kubernetes.io/projected/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-kube-api-access-mj2z5\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.776443 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data" (OuterVolumeSpecName: "config-data") pod "114dc2da-7bec-47de-a6f4-a37e2c56e1f8" (UID: "114dc2da-7bec-47de-a6f4-a37e2c56e1f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:49 crc kubenswrapper[4901]: I0309 03:02:49.871018 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/114dc2da-7bec-47de-a6f4-a37e2c56e1f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.340628 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.351954 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"114dc2da-7bec-47de-a6f4-a37e2c56e1f8","Type":"ContainerDied","Data":"fa7fc77893d67f929c1af8f14634b52181777a47d8e787146d600dfc23f00892"} Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.352757 4901 scope.go:117] "RemoveContainer" containerID="a55bddb0310856452ae351719287141366e86613b5d5808d8245686cb1b2b46c" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.351986 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.353740 4901 generic.go:334] "Generic (PLEG): container finished" podID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" containerID="5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5" exitCode=143 Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.353768 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f87bb5fc6-qwprj" event={"ID":"907b295e-2a2e-4bb0-ba41-4aa445eb8a29","Type":"ContainerDied","Data":"5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5"} Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.372587 4901 scope.go:117] "RemoveContainer" containerID="25c4d07f357e45acabb171f36d58fbe887d3725a24d1b8089d0660e83c055af4" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.380574 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.391469 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.407341 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:02:50 crc kubenswrapper[4901]: E0309 03:02:50.407683 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da2c732-ff7f-4359-8ce4-575fa65b8da0" containerName="dnsmasq-dns" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.407699 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da2c732-ff7f-4359-8ce4-575fa65b8da0" containerName="dnsmasq-dns" Mar 09 03:02:50 crc kubenswrapper[4901]: E0309 03:02:50.407713 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" containerName="probe" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.407719 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" containerName="probe" Mar 09 03:02:50 crc kubenswrapper[4901]: E0309 03:02:50.407734 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" containerName="cinder-scheduler" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.407740 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" containerName="cinder-scheduler" Mar 09 03:02:50 crc kubenswrapper[4901]: E0309 03:02:50.407756 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da2c732-ff7f-4359-8ce4-575fa65b8da0" containerName="init" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.407761 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da2c732-ff7f-4359-8ce4-575fa65b8da0" containerName="init" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.407918 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" containerName="cinder-scheduler" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.407933 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" containerName="probe" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.407953 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da2c732-ff7f-4359-8ce4-575fa65b8da0" containerName="dnsmasq-dns" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.408784 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.411387 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.421864 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.583650 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.583697 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-scripts\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.583766 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.583789 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w864\" (UniqueName: \"kubernetes.io/projected/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-kube-api-access-9w864\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.583849 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.583880 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.613386 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.685758 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.685814 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.685835 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-scripts\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 
crc kubenswrapper[4901]: I0309 03:02:50.685894 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.685917 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w864\" (UniqueName: \"kubernetes.io/projected/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-kube-api-access-9w864\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.685977 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.686411 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.691433 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-scripts\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.694719 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.702905 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.703826 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.704924 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w864\" (UniqueName: \"kubernetes.io/projected/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-kube-api-access-9w864\") pod \"cinder-scheduler-0\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.727506 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.898195 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5ddb85f7bb-phpwc"] Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.900185 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:50 crc kubenswrapper[4901]: I0309 03:02:50.916366 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ddb85f7bb-phpwc"] Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:50.996331 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-public-tls-certs\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:50.996382 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-logs\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:50.996406 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-combined-ca-bundle\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:50.996450 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-config-data\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:50.996478 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-scripts\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:50.996537 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-internal-tls-certs\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:50.996564 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nm4l\" (UniqueName: \"kubernetes.io/projected/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-kube-api-access-9nm4l\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.098198 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-scripts\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.098342 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-internal-tls-certs\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.098388 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nm4l\" (UniqueName: 
\"kubernetes.io/projected/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-kube-api-access-9nm4l\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.098434 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-public-tls-certs\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.098456 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-logs\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.098479 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-combined-ca-bundle\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.098536 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-config-data\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.099730 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-logs\") pod \"placement-5ddb85f7bb-phpwc\" 
(UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.103422 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-combined-ca-bundle\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.103944 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-public-tls-certs\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.104755 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-scripts\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.105746 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-internal-tls-certs\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.108890 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-config-data\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc 
kubenswrapper[4901]: I0309 03:02:51.120791 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nm4l\" (UniqueName: \"kubernetes.io/projected/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-kube-api-access-9nm4l\") pod \"placement-5ddb85f7bb-phpwc\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.228728 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.313456 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.377335 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9","Type":"ContainerStarted","Data":"f38b4b55666610f8dbc59a27371dfe35ab766a60ffe8fff7f77a0ae5cd048172"} Mar 09 03:02:51 crc kubenswrapper[4901]: I0309 03:02:51.722340 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ddb85f7bb-phpwc"] Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.092704 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.147554 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="114dc2da-7bec-47de-a6f4-a37e2c56e1f8" path="/var/lib/kubelet/pods/114dc2da-7bec-47de-a6f4-a37e2c56e1f8/volumes" Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.395317 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ddb85f7bb-phpwc" event={"ID":"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3","Type":"ContainerStarted","Data":"ff023d589de2e36a5396ff975d9b99b7c27af8bbe416330e6c016841896c5a55"} Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.395399 4901 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.395595 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.395607 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ddb85f7bb-phpwc" event={"ID":"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3","Type":"ContainerStarted","Data":"4da5234fa05d69c236f53cbbc103505a093111b7b3b09f7a401eaded8dc333cb"} Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.395623 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ddb85f7bb-phpwc" event={"ID":"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3","Type":"ContainerStarted","Data":"0c6cb5a8d63f828a66ef60ea27462787d5ebbbb27ddd435f77fedbaa4692fc7f"} Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.396562 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9","Type":"ContainerStarted","Data":"facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b"} Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.411158 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5ddb85f7bb-phpwc" podStartSLOduration=2.411143966 podStartE2EDuration="2.411143966s" podCreationTimestamp="2026-03-09 03:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:52.409454673 +0000 UTC m=+1296.999118405" watchObservedRunningTime="2026-03-09 03:02:52.411143966 +0000 UTC m=+1297.000807698" Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.846515 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.942496 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data-custom\") pod \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.942904 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-logs\") pod \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.943003 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzpqc\" (UniqueName: \"kubernetes.io/projected/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-kube-api-access-bzpqc\") pod \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.943170 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-combined-ca-bundle\") pod \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.943283 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data\") pod \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\" (UID: \"907b295e-2a2e-4bb0-ba41-4aa445eb8a29\") " Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.943441 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-logs" (OuterVolumeSpecName: "logs") pod "907b295e-2a2e-4bb0-ba41-4aa445eb8a29" (UID: "907b295e-2a2e-4bb0-ba41-4aa445eb8a29"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.943911 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.950861 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-kube-api-access-bzpqc" (OuterVolumeSpecName: "kube-api-access-bzpqc") pod "907b295e-2a2e-4bb0-ba41-4aa445eb8a29" (UID: "907b295e-2a2e-4bb0-ba41-4aa445eb8a29"). InnerVolumeSpecName "kube-api-access-bzpqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.984518 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "907b295e-2a2e-4bb0-ba41-4aa445eb8a29" (UID: "907b295e-2a2e-4bb0-ba41-4aa445eb8a29"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:52 crc kubenswrapper[4901]: I0309 03:02:52.994969 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "907b295e-2a2e-4bb0-ba41-4aa445eb8a29" (UID: "907b295e-2a2e-4bb0-ba41-4aa445eb8a29"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.021909 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data" (OuterVolumeSpecName: "config-data") pod "907b295e-2a2e-4bb0-ba41-4aa445eb8a29" (UID: "907b295e-2a2e-4bb0-ba41-4aa445eb8a29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.047077 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.047122 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.047132 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzpqc\" (UniqueName: \"kubernetes.io/projected/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-kube-api-access-bzpqc\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.047143 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907b295e-2a2e-4bb0-ba41-4aa445eb8a29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.409110 4901 generic.go:334] "Generic (PLEG): container finished" podID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" containerID="b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171" exitCode=0 Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.409179 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f87bb5fc6-qwprj" 
event={"ID":"907b295e-2a2e-4bb0-ba41-4aa445eb8a29","Type":"ContainerDied","Data":"b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171"} Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.409192 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f87bb5fc6-qwprj" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.410370 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f87bb5fc6-qwprj" event={"ID":"907b295e-2a2e-4bb0-ba41-4aa445eb8a29","Type":"ContainerDied","Data":"30824ba14fe53d011c60a0a56da541656b9ed79b46d94f5fc0a7b0560e8bad46"} Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.410395 4901 scope.go:117] "RemoveContainer" containerID="b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.412822 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9","Type":"ContainerStarted","Data":"91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075"} Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.438210 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.43818818 podStartE2EDuration="3.43818818s" podCreationTimestamp="2026-03-09 03:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:02:53.431343347 +0000 UTC m=+1298.021007089" watchObservedRunningTime="2026-03-09 03:02:53.43818818 +0000 UTC m=+1298.027851912" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.460419 4901 scope.go:117] "RemoveContainer" containerID="5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.463698 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-api-6f87bb5fc6-qwprj"] Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.471853 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6f87bb5fc6-qwprj"] Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.487836 4901 scope.go:117] "RemoveContainer" containerID="b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171" Mar 09 03:02:53 crc kubenswrapper[4901]: E0309 03:02:53.490786 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171\": container with ID starting with b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171 not found: ID does not exist" containerID="b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.490820 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171"} err="failed to get container status \"b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171\": rpc error: code = NotFound desc = could not find container \"b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171\": container with ID starting with b0b1e1836b4a7e67bc4916199d7607a4524f8d7550ca48b13dd84f6ef5e20171 not found: ID does not exist" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.490843 4901 scope.go:117] "RemoveContainer" containerID="5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5" Mar 09 03:02:53 crc kubenswrapper[4901]: E0309 03:02:53.494498 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5\": container with ID starting with 5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5 not found: ID 
does not exist" containerID="5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.494540 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5"} err="failed to get container status \"5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5\": rpc error: code = NotFound desc = could not find container \"5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5\": container with ID starting with 5db3a19e40c37a82437ff38abde51a4a6e1c83227d3710d42e45676f33438ee5 not found: ID does not exist" Mar 09 03:02:53 crc kubenswrapper[4901]: I0309 03:02:53.703342 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 09 03:02:54 crc kubenswrapper[4901]: I0309 03:02:54.114839 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" path="/var/lib/kubelet/pods/907b295e-2a2e-4bb0-ba41-4aa445eb8a29/volumes" Mar 09 03:02:54 crc kubenswrapper[4901]: I0309 03:02:54.424296 4901 generic.go:334] "Generic (PLEG): container finished" podID="7f7aec5c-9887-4331-8806-3164120e927e" containerID="d470f4319d83b0f2465857ddb8bb7578426b84cca0130f611b83b80a7b4431a5" exitCode=0 Mar 09 03:02:54 crc kubenswrapper[4901]: I0309 03:02:54.424497 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d6545db5-vnkq4" event={"ID":"7f7aec5c-9887-4331-8806-3164120e927e","Type":"ContainerDied","Data":"d470f4319d83b0f2465857ddb8bb7578426b84cca0130f611b83b80a7b4431a5"} Mar 09 03:02:54 crc kubenswrapper[4901]: I0309 03:02:54.959878 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.084322 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-httpd-config\") pod \"7f7aec5c-9887-4331-8806-3164120e927e\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.084407 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-combined-ca-bundle\") pod \"7f7aec5c-9887-4331-8806-3164120e927e\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.084466 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-internal-tls-certs\") pod \"7f7aec5c-9887-4331-8806-3164120e927e\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.084546 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-public-tls-certs\") pod \"7f7aec5c-9887-4331-8806-3164120e927e\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.084606 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-ovndb-tls-certs\") pod \"7f7aec5c-9887-4331-8806-3164120e927e\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.084648 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-config\") pod \"7f7aec5c-9887-4331-8806-3164120e927e\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.084843 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd6hh\" (UniqueName: \"kubernetes.io/projected/7f7aec5c-9887-4331-8806-3164120e927e-kube-api-access-kd6hh\") pod \"7f7aec5c-9887-4331-8806-3164120e927e\" (UID: \"7f7aec5c-9887-4331-8806-3164120e927e\") " Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.090976 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7f7aec5c-9887-4331-8806-3164120e927e" (UID: "7f7aec5c-9887-4331-8806-3164120e927e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.093160 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f7aec5c-9887-4331-8806-3164120e927e-kube-api-access-kd6hh" (OuterVolumeSpecName: "kube-api-access-kd6hh") pod "7f7aec5c-9887-4331-8806-3164120e927e" (UID: "7f7aec5c-9887-4331-8806-3164120e927e"). InnerVolumeSpecName "kube-api-access-kd6hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.146749 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-config" (OuterVolumeSpecName: "config") pod "7f7aec5c-9887-4331-8806-3164120e927e" (UID: "7f7aec5c-9887-4331-8806-3164120e927e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.168527 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7f7aec5c-9887-4331-8806-3164120e927e" (UID: "7f7aec5c-9887-4331-8806-3164120e927e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.171861 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7f7aec5c-9887-4331-8806-3164120e927e" (UID: "7f7aec5c-9887-4331-8806-3164120e927e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.173922 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f7aec5c-9887-4331-8806-3164120e927e" (UID: "7f7aec5c-9887-4331-8806-3164120e927e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.181658 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7f7aec5c-9887-4331-8806-3164120e927e" (UID: "7f7aec5c-9887-4331-8806-3164120e927e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.189920 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd6hh\" (UniqueName: \"kubernetes.io/projected/7f7aec5c-9887-4331-8806-3164120e927e-kube-api-access-kd6hh\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.190145 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.190431 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.190595 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.190759 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.190871 4901 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.190989 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f7aec5c-9887-4331-8806-3164120e927e-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.437167 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d6545db5-vnkq4" event={"ID":"7f7aec5c-9887-4331-8806-3164120e927e","Type":"ContainerDied","Data":"3702362ca50b6d59759c487de59179513d8ee76d883cfb504dcec113ac0330e4"} Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.437589 4901 scope.go:117] "RemoveContainer" containerID="8c06a059ceda4f7e216a798de721d42fa01fd9f026bcf1c558dc22f3544edb04" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.437267 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57d6545db5-vnkq4" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.474783 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57d6545db5-vnkq4"] Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.478952 4901 scope.go:117] "RemoveContainer" containerID="d470f4319d83b0f2465857ddb8bb7578426b84cca0130f611b83b80a7b4431a5" Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.483681 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-57d6545db5-vnkq4"] Mar 09 03:02:55 crc kubenswrapper[4901]: I0309 03:02:55.727842 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.128737 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f7aec5c-9887-4331-8806-3164120e927e" path="/var/lib/kubelet/pods/7f7aec5c-9887-4331-8806-3164120e927e/volumes" Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.202708 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 09 03:02:56 crc kubenswrapper[4901]: E0309 03:02:56.203616 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7aec5c-9887-4331-8806-3164120e927e" containerName="neutron-httpd" Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.203756 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7f7aec5c-9887-4331-8806-3164120e927e" containerName="neutron-httpd"
Mar 09 03:02:56 crc kubenswrapper[4901]: E0309 03:02:56.203933 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7aec5c-9887-4331-8806-3164120e927e" containerName="neutron-api"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.204045 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7aec5c-9887-4331-8806-3164120e927e" containerName="neutron-api"
Mar 09 03:02:56 crc kubenswrapper[4901]: E0309 03:02:56.204160 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" containerName="barbican-api-log"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.204297 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" containerName="barbican-api-log"
Mar 09 03:02:56 crc kubenswrapper[4901]: E0309 03:02:56.204446 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" containerName="barbican-api"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.204607 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" containerName="barbican-api"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.205045 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" containerName="barbican-api-log"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.205181 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f7aec5c-9887-4331-8806-3164120e927e" containerName="neutron-httpd"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.205364 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="907b295e-2a2e-4bb0-ba41-4aa445eb8a29" containerName="barbican-api"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.205510 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f7aec5c-9887-4331-8806-3164120e927e" containerName="neutron-api"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.206563 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.209698 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.209810 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.212758 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zlnz7"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.213532 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.314172 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.314514 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.314683 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config-secret\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.314853 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbbk9\" (UniqueName: \"kubernetes.io/projected/34bc86a8-8821-462a-b15b-c2f847f44be2-kube-api-access-jbbk9\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.416583 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.416863 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.416896 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config-secret\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.416961 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbbk9\" (UniqueName: \"kubernetes.io/projected/34bc86a8-8821-462a-b15b-c2f847f44be2-kube-api-access-jbbk9\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.418053 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.422834 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.427007 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config-secret\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.437882 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbbk9\" (UniqueName: \"kubernetes.io/projected/34bc86a8-8821-462a-b15b-c2f847f44be2-kube-api-access-jbbk9\") pod \"openstackclient\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " pod="openstack/openstackclient"
Mar 09 03:02:56 crc kubenswrapper[4901]: I0309 03:02:56.544532 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 09 03:02:57 crc kubenswrapper[4901]: W0309 03:02:57.029041 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34bc86a8_8821_462a_b15b_c2f847f44be2.slice/crio-2f4ba5b63b2f6a5bf3e40e0173d05d44009443a7a823397d3e39ecd9c0a2d872 WatchSource:0}: Error finding container 2f4ba5b63b2f6a5bf3e40e0173d05d44009443a7a823397d3e39ecd9c0a2d872: Status 404 returned error can't find the container with id 2f4ba5b63b2f6a5bf3e40e0173d05d44009443a7a823397d3e39ecd9c0a2d872
Mar 09 03:02:57 crc kubenswrapper[4901]: I0309 03:02:57.029509 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 09 03:02:57 crc kubenswrapper[4901]: I0309 03:02:57.457262 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"34bc86a8-8821-462a-b15b-c2f847f44be2","Type":"ContainerStarted","Data":"2f4ba5b63b2f6a5bf3e40e0173d05d44009443a7a823397d3e39ecd9c0a2d872"}
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.784246 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-579bf976f9-ds45q"]
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.793207 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.795952 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.796175 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.802286 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.812765 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-579bf976f9-ds45q"]
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.863310 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.863635 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.893813 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2bgt\" (UniqueName: \"kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-kube-api-access-z2bgt\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.893889 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-combined-ca-bundle\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.893977 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-config-data\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.894049 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-log-httpd\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.894121 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-public-tls-certs\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.894162 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-internal-tls-certs\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.894264 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-run-httpd\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.894322 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-etc-swift\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.995817 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-log-httpd\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.995882 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-public-tls-certs\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.995909 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-internal-tls-certs\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.995951 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-run-httpd\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.995983 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-etc-swift\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.996012 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2bgt\" (UniqueName: \"kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-kube-api-access-z2bgt\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.996058 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-combined-ca-bundle\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.996088 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-config-data\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.996318 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-log-httpd\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:00 crc kubenswrapper[4901]: I0309 03:03:00.996970 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-run-httpd\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:01 crc kubenswrapper[4901]: I0309 03:03:01.000995 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-internal-tls-certs\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:01 crc kubenswrapper[4901]: I0309 03:03:01.001245 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-public-tls-certs\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:01 crc kubenswrapper[4901]: I0309 03:03:01.002237 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-etc-swift\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:01 crc kubenswrapper[4901]: I0309 03:03:01.009327 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 09 03:03:01 crc kubenswrapper[4901]: I0309 03:03:01.015533 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-combined-ca-bundle\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:01 crc kubenswrapper[4901]: I0309 03:03:01.015780 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2bgt\" (UniqueName: \"kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-kube-api-access-z2bgt\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:01 crc kubenswrapper[4901]: I0309 03:03:01.031622 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-config-data\") pod \"swift-proxy-579bf976f9-ds45q\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:01 crc kubenswrapper[4901]: I0309 03:03:01.152730 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:02 crc kubenswrapper[4901]: I0309 03:03:02.600094 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 03:03:02 crc kubenswrapper[4901]: I0309 03:03:02.600894 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="ceilometer-central-agent" containerID="cri-o://a0c62ebde44dc827946759905296f74d7febc121c09c142bec80c26419afec3b" gracePeriod=30
Mar 09 03:03:02 crc kubenswrapper[4901]: I0309 03:03:02.601388 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="proxy-httpd" containerID="cri-o://b8495428bc91f79334a918c6d694f02ee4ef2e45cc7507747ca003edd66fe46d" gracePeriod=30
Mar 09 03:03:02 crc kubenswrapper[4901]: I0309 03:03:02.601376 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="sg-core" containerID="cri-o://575fa649b573943652ea995943aae329e269a75ec468388a3fb8f2d76c031b60" gracePeriod=30
Mar 09 03:03:02 crc kubenswrapper[4901]: I0309 03:03:02.601504 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="ceilometer-notification-agent" containerID="cri-o://699fbdc0c622a452d327135f74b14302cd75bb5453d66ac8e9692e83dbabf682" gracePeriod=30
Mar 09 03:03:02 crc kubenswrapper[4901]: I0309 03:03:02.609902 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.167:3000/\": EOF"
Mar 09 03:03:03 crc kubenswrapper[4901]: I0309 03:03:03.524201 4901 generic.go:334] "Generic (PLEG): container finished" podID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerID="b8495428bc91f79334a918c6d694f02ee4ef2e45cc7507747ca003edd66fe46d" exitCode=0
Mar 09 03:03:03 crc kubenswrapper[4901]: I0309 03:03:03.524475 4901 generic.go:334] "Generic (PLEG): container finished" podID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerID="575fa649b573943652ea995943aae329e269a75ec468388a3fb8f2d76c031b60" exitCode=2
Mar 09 03:03:03 crc kubenswrapper[4901]: I0309 03:03:03.524488 4901 generic.go:334] "Generic (PLEG): container finished" podID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerID="699fbdc0c622a452d327135f74b14302cd75bb5453d66ac8e9692e83dbabf682" exitCode=0
Mar 09 03:03:03 crc kubenswrapper[4901]: I0309 03:03:03.524497 4901 generic.go:334] "Generic (PLEG): container finished" podID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerID="a0c62ebde44dc827946759905296f74d7febc121c09c142bec80c26419afec3b" exitCode=0
Mar 09 03:03:03 crc kubenswrapper[4901]: I0309 03:03:03.524263 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5157e341-8a57-4f03-8061-e6c8853dddb4","Type":"ContainerDied","Data":"b8495428bc91f79334a918c6d694f02ee4ef2e45cc7507747ca003edd66fe46d"}
Mar 09 03:03:03 crc kubenswrapper[4901]: I0309 03:03:03.524531 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5157e341-8a57-4f03-8061-e6c8853dddb4","Type":"ContainerDied","Data":"575fa649b573943652ea995943aae329e269a75ec468388a3fb8f2d76c031b60"}
Mar 09 03:03:03 crc kubenswrapper[4901]: I0309 03:03:03.524545 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5157e341-8a57-4f03-8061-e6c8853dddb4","Type":"ContainerDied","Data":"699fbdc0c622a452d327135f74b14302cd75bb5453d66ac8e9692e83dbabf682"}
Mar 09 03:03:03 crc kubenswrapper[4901]: I0309 03:03:03.524555 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5157e341-8a57-4f03-8061-e6c8853dddb4","Type":"ContainerDied","Data":"a0c62ebde44dc827946759905296f74d7febc121c09c142bec80c26419afec3b"}
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.610144 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.680538 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-scripts\") pod \"5157e341-8a57-4f03-8061-e6c8853dddb4\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") "
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.680688 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-combined-ca-bundle\") pod \"5157e341-8a57-4f03-8061-e6c8853dddb4\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") "
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.680737 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-run-httpd\") pod \"5157e341-8a57-4f03-8061-e6c8853dddb4\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") "
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.680777 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-log-httpd\") pod \"5157e341-8a57-4f03-8061-e6c8853dddb4\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") "
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.680806 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdzxk\" (UniqueName: \"kubernetes.io/projected/5157e341-8a57-4f03-8061-e6c8853dddb4-kube-api-access-kdzxk\") pod \"5157e341-8a57-4f03-8061-e6c8853dddb4\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") "
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.680857 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-sg-core-conf-yaml\") pod \"5157e341-8a57-4f03-8061-e6c8853dddb4\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") "
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.680881 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-config-data\") pod \"5157e341-8a57-4f03-8061-e6c8853dddb4\" (UID: \"5157e341-8a57-4f03-8061-e6c8853dddb4\") "
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.681728 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5157e341-8a57-4f03-8061-e6c8853dddb4" (UID: "5157e341-8a57-4f03-8061-e6c8853dddb4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.681990 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5157e341-8a57-4f03-8061-e6c8853dddb4" (UID: "5157e341-8a57-4f03-8061-e6c8853dddb4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.686206 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-scripts" (OuterVolumeSpecName: "scripts") pod "5157e341-8a57-4f03-8061-e6c8853dddb4" (UID: "5157e341-8a57-4f03-8061-e6c8853dddb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.686568 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5157e341-8a57-4f03-8061-e6c8853dddb4-kube-api-access-kdzxk" (OuterVolumeSpecName: "kube-api-access-kdzxk") pod "5157e341-8a57-4f03-8061-e6c8853dddb4" (UID: "5157e341-8a57-4f03-8061-e6c8853dddb4"). InnerVolumeSpecName "kube-api-access-kdzxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.705742 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5157e341-8a57-4f03-8061-e6c8853dddb4" (UID: "5157e341-8a57-4f03-8061-e6c8853dddb4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.750999 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5157e341-8a57-4f03-8061-e6c8853dddb4" (UID: "5157e341-8a57-4f03-8061-e6c8853dddb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.783193 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.783281 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.783311 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.783334 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.783355 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5157e341-8a57-4f03-8061-e6c8853dddb4-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.783374 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdzxk\" (UniqueName: \"kubernetes.io/projected/5157e341-8a57-4f03-8061-e6c8853dddb4-kube-api-access-kdzxk\") on node \"crc\" DevicePath \"\""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.813153 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-config-data" (OuterVolumeSpecName: "config-data") pod "5157e341-8a57-4f03-8061-e6c8853dddb4" (UID: "5157e341-8a57-4f03-8061-e6c8853dddb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.885533 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5157e341-8a57-4f03-8061-e6c8853dddb4-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 03:03:05 crc kubenswrapper[4901]: W0309 03:03:05.920956 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfd87218_f7dc_424a_acda_dd7b57792738.slice/crio-9d64fb7ca5417e814d9976f81b03a48cde8b3f4460dfc64117105a4c35428c75 WatchSource:0}: Error finding container 9d64fb7ca5417e814d9976f81b03a48cde8b3f4460dfc64117105a4c35428c75: Status 404 returned error can't find the container with id 9d64fb7ca5417e814d9976f81b03a48cde8b3f4460dfc64117105a4c35428c75
Mar 09 03:03:05 crc kubenswrapper[4901]: I0309 03:03:05.922022 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-579bf976f9-ds45q"]
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.557667 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"34bc86a8-8821-462a-b15b-c2f847f44be2","Type":"ContainerStarted","Data":"1c5af01525dee027d91335bfc70f1c4c730cdc63e3d8be7b219fbbab05c905fa"}
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.562530 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5157e341-8a57-4f03-8061-e6c8853dddb4","Type":"ContainerDied","Data":"025c1f8e927ed124ac565ae6e468bf87410439ab05e8e0908ea26de249ac72bf"}
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.562585 4901 scope.go:117] "RemoveContainer" containerID="b8495428bc91f79334a918c6d694f02ee4ef2e45cc7507747ca003edd66fe46d"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.562715 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.565329 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-579bf976f9-ds45q" event={"ID":"cfd87218-f7dc-424a-acda-dd7b57792738","Type":"ContainerStarted","Data":"f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2"}
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.565370 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-579bf976f9-ds45q" event={"ID":"cfd87218-f7dc-424a-acda-dd7b57792738","Type":"ContainerStarted","Data":"5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd"}
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.565384 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-579bf976f9-ds45q" event={"ID":"cfd87218-f7dc-424a-acda-dd7b57792738","Type":"ContainerStarted","Data":"9d64fb7ca5417e814d9976f81b03a48cde8b3f4460dfc64117105a4c35428c75"}
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.565558 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-579bf976f9-ds45q"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.580091 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.243311585 podStartE2EDuration="10.580073986s" podCreationTimestamp="2026-03-09 03:02:56 +0000 UTC" firstStartedPulling="2026-03-09 03:02:57.031521411 +0000 UTC m=+1301.621185143" lastFinishedPulling="2026-03-09 03:03:05.368283822 +0000 UTC m=+1309.957947544" observedRunningTime="2026-03-09 03:03:06.576030654 +0000 UTC m=+1311.165694426" watchObservedRunningTime="2026-03-09 03:03:06.580073986 +0000 UTC m=+1311.169737718"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.590637 4901 scope.go:117] "RemoveContainer" containerID="575fa649b573943652ea995943aae329e269a75ec468388a3fb8f2d76c031b60"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.601310 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.628668 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.629729 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-579bf976f9-ds45q" podStartSLOduration=6.629714668 podStartE2EDuration="6.629714668s" podCreationTimestamp="2026-03-09 03:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:03:06.616923086 +0000 UTC m=+1311.206586818" watchObservedRunningTime="2026-03-09 03:03:06.629714668 +0000 UTC m=+1311.219378400"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.633044 4901 scope.go:117] "RemoveContainer" containerID="699fbdc0c622a452d327135f74b14302cd75bb5453d66ac8e9692e83dbabf682"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.646504 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 09 03:03:06 crc kubenswrapper[4901]: E0309 03:03:06.646896 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="ceilometer-notification-agent"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.646911 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="ceilometer-notification-agent"
Mar 09 03:03:06 crc kubenswrapper[4901]: E0309 03:03:06.646922 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="proxy-httpd"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.646930 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="proxy-httpd"
Mar 09 03:03:06 crc kubenswrapper[4901]: E0309 03:03:06.646940 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="ceilometer-central-agent"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.646946 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="ceilometer-central-agent"
Mar 09 03:03:06 crc kubenswrapper[4901]: E0309 03:03:06.646963 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="sg-core"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.646968 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="sg-core"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.647143 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="sg-core"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.647162 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="proxy-httpd"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.647172 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="ceilometer-central-agent"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.647184 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" containerName="ceilometer-notification-agent"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.649133 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.652791 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.653514 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.654390 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.661501 4901 scope.go:117] "RemoveContainer" containerID="a0c62ebde44dc827946759905296f74d7febc121c09c142bec80c26419afec3b"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.706460 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-log-httpd\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.706523 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-run-httpd\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.706599 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-config-data\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0"
Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.706732 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-scripts\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.706797 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.706960 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl2jd\" (UniqueName: \"kubernetes.io/projected/788c3db9-e44c-4c56-a5d4-392dffa5e21d-kube-api-access-xl2jd\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.706987 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.808806 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-config-data\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.809277 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-scripts\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " 
pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.809340 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.809594 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl2jd\" (UniqueName: \"kubernetes.io/projected/788c3db9-e44c-4c56-a5d4-392dffa5e21d-kube-api-access-xl2jd\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.809649 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.809709 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-log-httpd\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.809776 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-run-httpd\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.810339 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-run-httpd\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.811842 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-log-httpd\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.813877 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-scripts\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.814212 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.830199 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-config-data\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.836555 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.842919 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xl2jd\" (UniqueName: \"kubernetes.io/projected/788c3db9-e44c-4c56-a5d4-392dffa5e21d-kube-api-access-xl2jd\") pod \"ceilometer-0\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " pod="openstack/ceilometer-0" Mar 09 03:03:06 crc kubenswrapper[4901]: I0309 03:03:06.997573 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:03:07 crc kubenswrapper[4901]: I0309 03:03:07.342439 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:03:07 crc kubenswrapper[4901]: I0309 03:03:07.342864 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2e7f8f6c-6ee2-4c69-a626-59821baff365" containerName="glance-log" containerID="cri-o://4ecd8a9a0c326c2f31d5e17c784e9ff098a2c8264061ea6c9e191bc6bcedc8f9" gracePeriod=30 Mar 09 03:03:07 crc kubenswrapper[4901]: I0309 03:03:07.346616 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2e7f8f6c-6ee2-4c69-a626-59821baff365" containerName="glance-httpd" containerID="cri-o://48a396afec967efb548fbf555fa8d2eba2851c4ce077fb176764c6f76b80f5f6" gracePeriod=30 Mar 09 03:03:07 crc kubenswrapper[4901]: I0309 03:03:07.468926 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:03:07 crc kubenswrapper[4901]: I0309 03:03:07.573619 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"788c3db9-e44c-4c56-a5d4-392dffa5e21d","Type":"ContainerStarted","Data":"ee4c113f4e407543b0df7cf2543b7314cc1d6a562112679ee564306000b1b900"} Mar 09 03:03:07 crc kubenswrapper[4901]: I0309 03:03:07.576044 4901 generic.go:334] "Generic (PLEG): container finished" podID="2e7f8f6c-6ee2-4c69-a626-59821baff365" 
containerID="4ecd8a9a0c326c2f31d5e17c784e9ff098a2c8264061ea6c9e191bc6bcedc8f9" exitCode=143 Mar 09 03:03:07 crc kubenswrapper[4901]: I0309 03:03:07.576106 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e7f8f6c-6ee2-4c69-a626-59821baff365","Type":"ContainerDied","Data":"4ecd8a9a0c326c2f31d5e17c784e9ff098a2c8264061ea6c9e191bc6bcedc8f9"} Mar 09 03:03:07 crc kubenswrapper[4901]: I0309 03:03:07.576377 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-579bf976f9-ds45q" Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.117448 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5157e341-8a57-4f03-8061-e6c8853dddb4" path="/var/lib/kubelet/pods/5157e341-8a57-4f03-8061-e6c8853dddb4/volumes" Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.169896 4901 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8c59723e_e60d_41fe_8ddc_7bf0bbdafe4e.slice" Mar 09 03:03:08 crc kubenswrapper[4901]: E0309 03:03:08.170184 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8c59723e_e60d_41fe_8ddc_7bf0bbdafe4e.slice" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" podUID="8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.220817 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:03:08 crc 
kubenswrapper[4901]: I0309 03:03:08.221102 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerName="glance-log" containerID="cri-o://41acdeef4d56deb89483a044161d31214685253b8c9d8783b8df0b16a66ac281" gracePeriod=30 Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.221282 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerName="glance-httpd" containerID="cri-o://1138330afb28e0cc6c7476a832e0a83a44deeaf627f09fd8766d81d2294dff1e" gracePeriod=30 Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.330785 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.584938 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"788c3db9-e44c-4c56-a5d4-392dffa5e21d","Type":"ContainerStarted","Data":"f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d"} Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.586569 4901 generic.go:334] "Generic (PLEG): container finished" podID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerID="41acdeef4d56deb89483a044161d31214685253b8c9d8783b8df0b16a66ac281" exitCode=143 Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.586622 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"98c082b4-bd96-4b48-80ec-6c0dc672ec58","Type":"ContainerDied","Data":"41acdeef4d56deb89483a044161d31214685253b8c9d8783b8df0b16a66ac281"} Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.586640 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c649d8d5-bs2zk" Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.607920 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-bs2zk"] Mar 09 03:03:08 crc kubenswrapper[4901]: I0309 03:03:08.614251 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55c649d8d5-bs2zk"] Mar 09 03:03:09 crc kubenswrapper[4901]: I0309 03:03:09.596182 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"788c3db9-e44c-4c56-a5d4-392dffa5e21d","Type":"ContainerStarted","Data":"288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291"} Mar 09 03:03:09 crc kubenswrapper[4901]: I0309 03:03:09.596699 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"788c3db9-e44c-4c56-a5d4-392dffa5e21d","Type":"ContainerStarted","Data":"207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936"} Mar 09 03:03:10 crc kubenswrapper[4901]: I0309 03:03:10.117447 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e" path="/var/lib/kubelet/pods/8c59723e-e60d-41fe-8ddc-7bf0bbdafe4e/volumes" Mar 09 03:03:10 crc kubenswrapper[4901]: I0309 03:03:10.608812 4901 generic.go:334] "Generic (PLEG): container finished" podID="2e7f8f6c-6ee2-4c69-a626-59821baff365" containerID="48a396afec967efb548fbf555fa8d2eba2851c4ce077fb176764c6f76b80f5f6" exitCode=0 Mar 09 03:03:10 crc kubenswrapper[4901]: I0309 03:03:10.608901 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e7f8f6c-6ee2-4c69-a626-59821baff365","Type":"ContainerDied","Data":"48a396afec967efb548fbf555fa8d2eba2851c4ce077fb176764c6f76b80f5f6"} Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.006443 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.091196 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-httpd-run\") pod \"2e7f8f6c-6ee2-4c69-a626-59821baff365\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.091631 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-scripts\") pod \"2e7f8f6c-6ee2-4c69-a626-59821baff365\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.091793 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2e7f8f6c-6ee2-4c69-a626-59821baff365" (UID: "2e7f8f6c-6ee2-4c69-a626-59821baff365"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.091837 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gngb5\" (UniqueName: \"kubernetes.io/projected/2e7f8f6c-6ee2-4c69-a626-59821baff365-kube-api-access-gngb5\") pod \"2e7f8f6c-6ee2-4c69-a626-59821baff365\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.092301 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-public-tls-certs\") pod \"2e7f8f6c-6ee2-4c69-a626-59821baff365\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.092330 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-logs\") pod \"2e7f8f6c-6ee2-4c69-a626-59821baff365\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.092380 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"2e7f8f6c-6ee2-4c69-a626-59821baff365\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.092467 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-config-data\") pod \"2e7f8f6c-6ee2-4c69-a626-59821baff365\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.092505 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-combined-ca-bundle\") pod \"2e7f8f6c-6ee2-4c69-a626-59821baff365\" (UID: \"2e7f8f6c-6ee2-4c69-a626-59821baff365\") " Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.092682 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-logs" (OuterVolumeSpecName: "logs") pod "2e7f8f6c-6ee2-4c69-a626-59821baff365" (UID: "2e7f8f6c-6ee2-4c69-a626-59821baff365"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.093422 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.093463 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e7f8f6c-6ee2-4c69-a626-59821baff365-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.099525 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "2e7f8f6c-6ee2-4c69-a626-59821baff365" (UID: "2e7f8f6c-6ee2-4c69-a626-59821baff365"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.099906 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e7f8f6c-6ee2-4c69-a626-59821baff365-kube-api-access-gngb5" (OuterVolumeSpecName: "kube-api-access-gngb5") pod "2e7f8f6c-6ee2-4c69-a626-59821baff365" (UID: "2e7f8f6c-6ee2-4c69-a626-59821baff365"). InnerVolumeSpecName "kube-api-access-gngb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.116425 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-scripts" (OuterVolumeSpecName: "scripts") pod "2e7f8f6c-6ee2-4c69-a626-59821baff365" (UID: "2e7f8f6c-6ee2-4c69-a626-59821baff365"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.136427 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e7f8f6c-6ee2-4c69-a626-59821baff365" (UID: "2e7f8f6c-6ee2-4c69-a626-59821baff365"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.167346 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-config-data" (OuterVolumeSpecName: "config-data") pod "2e7f8f6c-6ee2-4c69-a626-59821baff365" (UID: "2e7f8f6c-6ee2-4c69-a626-59821baff365"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.184604 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-579bf976f9-ds45q" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.188243 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-579bf976f9-ds45q" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.188775 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2e7f8f6c-6ee2-4c69-a626-59821baff365" (UID: "2e7f8f6c-6ee2-4c69-a626-59821baff365"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.195517 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.195537 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gngb5\" (UniqueName: \"kubernetes.io/projected/2e7f8f6c-6ee2-4c69-a626-59821baff365-kube-api-access-gngb5\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.195546 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.195565 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.195574 4901 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.195583 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7f8f6c-6ee2-4c69-a626-59821baff365-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.212718 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.297478 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.398063 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": read tcp 10.217.0.2:44998->10.217.0.153:9292: read: connection reset by peer" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.398111 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": read tcp 10.217.0.2:45000->10.217.0.153:9292: read: connection reset by peer" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.629120 4901 generic.go:334] "Generic (PLEG): container finished" podID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerID="1138330afb28e0cc6c7476a832e0a83a44deeaf627f09fd8766d81d2294dff1e" exitCode=0 Mar 09 03:03:11 
crc kubenswrapper[4901]: I0309 03:03:11.629198 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"98c082b4-bd96-4b48-80ec-6c0dc672ec58","Type":"ContainerDied","Data":"1138330afb28e0cc6c7476a832e0a83a44deeaf627f09fd8766d81d2294dff1e"} Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.634413 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e7f8f6c-6ee2-4c69-a626-59821baff365","Type":"ContainerDied","Data":"ff4516cb1ea306f9399f867a4f7428ac49315b2eec48f097247e19cc02ad50a1"} Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.634463 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.634481 4901 scope.go:117] "RemoveContainer" containerID="48a396afec967efb548fbf555fa8d2eba2851c4ce077fb176764c6f76b80f5f6" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.658122 4901 scope.go:117] "RemoveContainer" containerID="4ecd8a9a0c326c2f31d5e17c784e9ff098a2c8264061ea6c9e191bc6bcedc8f9" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.670986 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.683291 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.693872 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:03:11 crc kubenswrapper[4901]: E0309 03:03:11.694252 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e7f8f6c-6ee2-4c69-a626-59821baff365" containerName="glance-httpd" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.694266 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2e7f8f6c-6ee2-4c69-a626-59821baff365" containerName="glance-httpd" Mar 09 03:03:11 crc kubenswrapper[4901]: E0309 03:03:11.694278 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e7f8f6c-6ee2-4c69-a626-59821baff365" containerName="glance-log" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.694285 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e7f8f6c-6ee2-4c69-a626-59821baff365" containerName="glance-log" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.694445 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e7f8f6c-6ee2-4c69-a626-59821baff365" containerName="glance-httpd" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.694470 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e7f8f6c-6ee2-4c69-a626-59821baff365" containerName="glance-log" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.697134 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.702253 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.703077 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.709071 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.807279 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc 
kubenswrapper[4901]: I0309 03:03:11.807381 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.807435 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.807467 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.807522 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.807562 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rw8\" (UniqueName: \"kubernetes.io/projected/ac19cc68-f23c-4622-b265-6e94db65a43f-kube-api-access-c6rw8\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" 
Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.807585 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.807635 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-logs\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.909575 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-logs\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.909632 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.909688 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.909727 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.909750 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.909788 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.909814 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.909831 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rw8\" (UniqueName: \"kubernetes.io/projected/ac19cc68-f23c-4622-b265-6e94db65a43f-kube-api-access-c6rw8\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.910174 4901 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.910281 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.910920 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-logs\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.937999 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.938297 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6rw8\" (UniqueName: \"kubernetes.io/projected/ac19cc68-f23c-4622-b265-6e94db65a43f-kube-api-access-c6rw8\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.938927 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.953931 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.964558 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:11 crc kubenswrapper[4901]: I0309 03:03:11.973710 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " pod="openstack/glance-default-external-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.028321 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.050684 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.133699 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.133958 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-httpd-run\") pod \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.134003 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-internal-tls-certs\") pod \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.134040 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-config-data\") pod \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.134055 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-scripts\") pod \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.134071 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55q5b\" (UniqueName: 
\"kubernetes.io/projected/98c082b4-bd96-4b48-80ec-6c0dc672ec58-kube-api-access-55q5b\") pod \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.134093 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-logs\") pod \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.134182 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-combined-ca-bundle\") pod \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\" (UID: \"98c082b4-bd96-4b48-80ec-6c0dc672ec58\") " Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.144386 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "98c082b4-bd96-4b48-80ec-6c0dc672ec58" (UID: "98c082b4-bd96-4b48-80ec-6c0dc672ec58"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.148438 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-logs" (OuterVolumeSpecName: "logs") pod "98c082b4-bd96-4b48-80ec-6c0dc672ec58" (UID: "98c082b4-bd96-4b48-80ec-6c0dc672ec58"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.166775 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c082b4-bd96-4b48-80ec-6c0dc672ec58-kube-api-access-55q5b" (OuterVolumeSpecName: "kube-api-access-55q5b") pod "98c082b4-bd96-4b48-80ec-6c0dc672ec58" (UID: "98c082b4-bd96-4b48-80ec-6c0dc672ec58"). InnerVolumeSpecName "kube-api-access-55q5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.173797 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e7f8f6c-6ee2-4c69-a626-59821baff365" path="/var/lib/kubelet/pods/2e7f8f6c-6ee2-4c69-a626-59821baff365/volumes" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.183607 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-scripts" (OuterVolumeSpecName: "scripts") pod "98c082b4-bd96-4b48-80ec-6c0dc672ec58" (UID: "98c082b4-bd96-4b48-80ec-6c0dc672ec58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.196369 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "98c082b4-bd96-4b48-80ec-6c0dc672ec58" (UID: "98c082b4-bd96-4b48-80ec-6c0dc672ec58"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.240072 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.240104 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.240113 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.240121 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55q5b\" (UniqueName: \"kubernetes.io/projected/98c082b4-bd96-4b48-80ec-6c0dc672ec58-kube-api-access-55q5b\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.240132 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c082b4-bd96-4b48-80ec-6c0dc672ec58-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.252877 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98c082b4-bd96-4b48-80ec-6c0dc672ec58" (UID: "98c082b4-bd96-4b48-80ec-6c0dc672ec58"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.316709 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.338032 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "98c082b4-bd96-4b48-80ec-6c0dc672ec58" (UID: "98c082b4-bd96-4b48-80ec-6c0dc672ec58"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.346296 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.346319 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.346328 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.365381 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-config-data" (OuterVolumeSpecName: "config-data") pod "98c082b4-bd96-4b48-80ec-6c0dc672ec58" (UID: "98c082b4-bd96-4b48-80ec-6c0dc672ec58"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.449932 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c082b4-bd96-4b48-80ec-6c0dc672ec58-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.661078 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"98c082b4-bd96-4b48-80ec-6c0dc672ec58","Type":"ContainerDied","Data":"20da0afed589aa15b9d0021134c60a488b9306e15f3779f5aba6f286522d4555"} Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.661143 4901 scope.go:117] "RemoveContainer" containerID="1138330afb28e0cc6c7476a832e0a83a44deeaf627f09fd8766d81d2294dff1e" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.661376 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.681240 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="ceilometer-central-agent" containerID="cri-o://f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d" gracePeriod=30 Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.681474 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.681693 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="proxy-httpd" containerID="cri-o://ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2" gracePeriod=30 Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.681738 4901 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="sg-core" containerID="cri-o://288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291" gracePeriod=30 Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.681770 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="ceilometer-notification-agent" containerID="cri-o://207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936" gracePeriod=30 Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.711724 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.77004312 podStartE2EDuration="6.711707429s" podCreationTimestamp="2026-03-09 03:03:06 +0000 UTC" firstStartedPulling="2026-03-09 03:03:07.473340626 +0000 UTC m=+1312.063004358" lastFinishedPulling="2026-03-09 03:03:12.415004935 +0000 UTC m=+1317.004668667" observedRunningTime="2026-03-09 03:03:12.704048806 +0000 UTC m=+1317.293712538" watchObservedRunningTime="2026-03-09 03:03:12.711707429 +0000 UTC m=+1317.301371161" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.723118 4901 scope.go:117] "RemoveContainer" containerID="41acdeef4d56deb89483a044161d31214685253b8c9d8783b8df0b16a66ac281" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.741233 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.750591 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.762299 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.767213 4901 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 09 03:03:12 crc kubenswrapper[4901]: E0309 03:03:12.771248 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerName="glance-httpd" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.771261 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerName="glance-httpd" Mar 09 03:03:12 crc kubenswrapper[4901]: E0309 03:03:12.771292 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerName="glance-log" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.771298 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerName="glance-log" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.771451 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerName="glance-httpd" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.771473 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" containerName="glance-log" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.772338 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.775122 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.775320 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.796517 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:03:12 crc kubenswrapper[4901]: W0309 03:03:12.807667 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac19cc68_f23c_4622_b265_6e94db65a43f.slice/crio-9152285f054a950d74baffc0f701b5abed3213de1f526aa39fa265f7214f9ed1 WatchSource:0}: Error finding container 9152285f054a950d74baffc0f701b5abed3213de1f526aa39fa265f7214f9ed1: Status 404 returned error can't find the container with id 9152285f054a950d74baffc0f701b5abed3213de1f526aa39fa265f7214f9ed1 Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.857687 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.858115 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.858163 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.858207 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.858257 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.858288 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.858311 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwnv6\" (UniqueName: \"kubernetes.io/projected/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-kube-api-access-cwnv6\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc 
kubenswrapper[4901]: I0309 03:03:12.858332 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-logs\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.959708 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.959764 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.959795 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.959819 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwnv6\" (UniqueName: \"kubernetes.io/projected/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-kube-api-access-cwnv6\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 
03:03:12.959841 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-logs\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.959867 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.959912 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.959952 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.960912 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.960935 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-logs\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.961336 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.977362 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.979499 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.979682 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.981345 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwnv6\" (UniqueName: 
\"kubernetes.io/projected/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-kube-api-access-cwnv6\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:12 crc kubenswrapper[4901]: I0309 03:03:12.981371 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:13 crc kubenswrapper[4901]: I0309 03:03:13.003136 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " pod="openstack/glance-default-internal-api-0" Mar 09 03:03:13 crc kubenswrapper[4901]: I0309 03:03:13.159854 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:13 crc kubenswrapper[4901]: I0309 03:03:13.705682 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac19cc68-f23c-4622-b265-6e94db65a43f","Type":"ContainerStarted","Data":"d69c564e49621dd1b26ec80f330b2a7ebc14dc4c83034905dc611e74754ca966"} Mar 09 03:03:13 crc kubenswrapper[4901]: I0309 03:03:13.706163 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac19cc68-f23c-4622-b265-6e94db65a43f","Type":"ContainerStarted","Data":"9152285f054a950d74baffc0f701b5abed3213de1f526aa39fa265f7214f9ed1"} Mar 09 03:03:13 crc kubenswrapper[4901]: I0309 03:03:13.715926 4901 generic.go:334] "Generic (PLEG): container finished" podID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerID="288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291" exitCode=2 Mar 09 03:03:13 crc kubenswrapper[4901]: I0309 03:03:13.715961 4901 generic.go:334] "Generic (PLEG): container finished" podID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerID="207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936" exitCode=0 Mar 09 03:03:13 crc kubenswrapper[4901]: I0309 03:03:13.715984 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"788c3db9-e44c-4c56-a5d4-392dffa5e21d","Type":"ContainerStarted","Data":"ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2"} Mar 09 03:03:13 crc kubenswrapper[4901]: I0309 03:03:13.716012 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"788c3db9-e44c-4c56-a5d4-392dffa5e21d","Type":"ContainerDied","Data":"288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291"} Mar 09 03:03:13 crc kubenswrapper[4901]: I0309 03:03:13.716027 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"788c3db9-e44c-4c56-a5d4-392dffa5e21d","Type":"ContainerDied","Data":"207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936"} Mar 09 03:03:13 crc kubenswrapper[4901]: I0309 03:03:13.735470 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:03:13 crc kubenswrapper[4901]: W0309 03:03:13.744782 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12ec135f_33b3_4be3_bb27_5bb0ea25ddce.slice/crio-f93312899e026f6aa80e17f08b7fea820f795932388bbc17762e07c45cc308c2 WatchSource:0}: Error finding container f93312899e026f6aa80e17f08b7fea820f795932388bbc17762e07c45cc308c2: Status 404 returned error can't find the container with id f93312899e026f6aa80e17f08b7fea820f795932388bbc17762e07c45cc308c2 Mar 09 03:03:14 crc kubenswrapper[4901]: I0309 03:03:14.158914 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c082b4-bd96-4b48-80ec-6c0dc672ec58" path="/var/lib/kubelet/pods/98c082b4-bd96-4b48-80ec-6c0dc672ec58/volumes" Mar 09 03:03:14 crc kubenswrapper[4901]: I0309 03:03:14.746979 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"12ec135f-33b3-4be3-bb27-5bb0ea25ddce","Type":"ContainerStarted","Data":"b6d35aeb5dab9771d9b67accacc82110a3fcd1a1a64f3b6be5cc15e368bd1336"} Mar 09 03:03:14 crc kubenswrapper[4901]: I0309 03:03:14.747349 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"12ec135f-33b3-4be3-bb27-5bb0ea25ddce","Type":"ContainerStarted","Data":"f93312899e026f6aa80e17f08b7fea820f795932388bbc17762e07c45cc308c2"} Mar 09 03:03:14 crc kubenswrapper[4901]: I0309 03:03:14.750568 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"ac19cc68-f23c-4622-b265-6e94db65a43f","Type":"ContainerStarted","Data":"fffb85172792a1f2d2029246715912b5942a62d42487ca9ffed83a800ad7a8d7"} Mar 09 03:03:15 crc kubenswrapper[4901]: I0309 03:03:15.368550 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:03:15 crc kubenswrapper[4901]: I0309 03:03:15.388448 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.388431291 podStartE2EDuration="4.388431291s" podCreationTimestamp="2026-03-09 03:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:03:14.772392913 +0000 UTC m=+1319.362056645" watchObservedRunningTime="2026-03-09 03:03:15.388431291 +0000 UTC m=+1319.978095023" Mar 09 03:03:15 crc kubenswrapper[4901]: I0309 03:03:15.436706 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-766b7d5cd8-9xjl5"] Mar 09 03:03:15 crc kubenswrapper[4901]: I0309 03:03:15.436919 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-766b7d5cd8-9xjl5" podUID="189df7f6-2b92-4f36-a6e6-1462bc471159" containerName="neutron-api" containerID="cri-o://bb6a418de9f8462e709a1779b927ca42664b3205b311697c756748bd01a3bb1e" gracePeriod=30 Mar 09 03:03:15 crc kubenswrapper[4901]: I0309 03:03:15.437048 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-766b7d5cd8-9xjl5" podUID="189df7f6-2b92-4f36-a6e6-1462bc471159" containerName="neutron-httpd" containerID="cri-o://b63c90a7bd193594cf46a664293cb499ada4dd2e2e4629d238df3be2d6a82bd4" gracePeriod=30 Mar 09 03:03:15 crc kubenswrapper[4901]: I0309 03:03:15.761348 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"12ec135f-33b3-4be3-bb27-5bb0ea25ddce","Type":"ContainerStarted","Data":"923da5f384dd45a881e4088f214dd21db8342270329da027650b26fef0f69378"} Mar 09 03:03:15 crc kubenswrapper[4901]: I0309 03:03:15.763116 4901 generic.go:334] "Generic (PLEG): container finished" podID="189df7f6-2b92-4f36-a6e6-1462bc471159" containerID="b63c90a7bd193594cf46a664293cb499ada4dd2e2e4629d238df3be2d6a82bd4" exitCode=0 Mar 09 03:03:15 crc kubenswrapper[4901]: I0309 03:03:15.763160 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766b7d5cd8-9xjl5" event={"ID":"189df7f6-2b92-4f36-a6e6-1462bc471159","Type":"ContainerDied","Data":"b63c90a7bd193594cf46a664293cb499ada4dd2e2e4629d238df3be2d6a82bd4"} Mar 09 03:03:16 crc kubenswrapper[4901]: I0309 03:03:16.776848 4901 generic.go:334] "Generic (PLEG): container finished" podID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerID="f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d" exitCode=0 Mar 09 03:03:16 crc kubenswrapper[4901]: I0309 03:03:16.776938 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"788c3db9-e44c-4c56-a5d4-392dffa5e21d","Type":"ContainerDied","Data":"f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d"} Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.791005 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.790985923 podStartE2EDuration="7.790985923s" podCreationTimestamp="2026-03-09 03:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:03:15.789839486 +0000 UTC m=+1320.379503218" watchObservedRunningTime="2026-03-09 03:03:19.790985923 +0000 UTC m=+1324.380649655" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.795320 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-kks6g"] Mar 
09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.796600 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kks6g" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.807301 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kks6g"] Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.832990 4901 generic.go:334] "Generic (PLEG): container finished" podID="189df7f6-2b92-4f36-a6e6-1462bc471159" containerID="bb6a418de9f8462e709a1779b927ca42664b3205b311697c756748bd01a3bb1e" exitCode=0 Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.833186 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766b7d5cd8-9xjl5" event={"ID":"189df7f6-2b92-4f36-a6e6-1462bc471159","Type":"ContainerDied","Data":"bb6a418de9f8462e709a1779b927ca42664b3205b311697c756748bd01a3bb1e"} Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.873916 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9kg2m"] Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.875297 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9kg2m" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.884257 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9kg2m"] Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.910144 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smwz5\" (UniqueName: \"kubernetes.io/projected/09d03d05-f63f-4a84-be7e-fcfaaae0505d-kube-api-access-smwz5\") pod \"nova-api-db-create-kks6g\" (UID: \"09d03d05-f63f-4a84-be7e-fcfaaae0505d\") " pod="openstack/nova-api-db-create-kks6g" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.910199 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a093673-8ed6-457e-8981-83864827e781-operator-scripts\") pod \"nova-cell0-db-create-9kg2m\" (UID: \"1a093673-8ed6-457e-8981-83864827e781\") " pod="openstack/nova-cell0-db-create-9kg2m" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.910284 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz9dh\" (UniqueName: \"kubernetes.io/projected/1a093673-8ed6-457e-8981-83864827e781-kube-api-access-fz9dh\") pod \"nova-cell0-db-create-9kg2m\" (UID: \"1a093673-8ed6-457e-8981-83864827e781\") " pod="openstack/nova-cell0-db-create-9kg2m" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.910530 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09d03d05-f63f-4a84-be7e-fcfaaae0505d-operator-scripts\") pod \"nova-api-db-create-kks6g\" (UID: \"09d03d05-f63f-4a84-be7e-fcfaaae0505d\") " pod="openstack/nova-api-db-create-kks6g" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.923873 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.980937 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-04a2-account-create-update-lz6dq"] Mar 09 03:03:19 crc kubenswrapper[4901]: E0309 03:03:19.981510 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189df7f6-2b92-4f36-a6e6-1462bc471159" containerName="neutron-httpd" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.981593 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="189df7f6-2b92-4f36-a6e6-1462bc471159" containerName="neutron-httpd" Mar 09 03:03:19 crc kubenswrapper[4901]: E0309 03:03:19.981668 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189df7f6-2b92-4f36-a6e6-1462bc471159" containerName="neutron-api" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.981718 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="189df7f6-2b92-4f36-a6e6-1462bc471159" containerName="neutron-api" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.981932 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="189df7f6-2b92-4f36-a6e6-1462bc471159" containerName="neutron-httpd" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.981991 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="189df7f6-2b92-4f36-a6e6-1462bc471159" containerName="neutron-api" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.982633 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-04a2-account-create-update-lz6dq" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.984506 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 09 03:03:19 crc kubenswrapper[4901]: I0309 03:03:19.990915 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-04a2-account-create-update-lz6dq"] Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.011763 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-ovndb-tls-certs\") pod \"189df7f6-2b92-4f36-a6e6-1462bc471159\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.011864 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-config\") pod \"189df7f6-2b92-4f36-a6e6-1462bc471159\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.011917 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-combined-ca-bundle\") pod \"189df7f6-2b92-4f36-a6e6-1462bc471159\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.012021 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck6bj\" (UniqueName: \"kubernetes.io/projected/189df7f6-2b92-4f36-a6e6-1462bc471159-kube-api-access-ck6bj\") pod \"189df7f6-2b92-4f36-a6e6-1462bc471159\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.012069 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-httpd-config\") pod \"189df7f6-2b92-4f36-a6e6-1462bc471159\" (UID: \"189df7f6-2b92-4f36-a6e6-1462bc471159\") " Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.012329 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz9dh\" (UniqueName: \"kubernetes.io/projected/1a093673-8ed6-457e-8981-83864827e781-kube-api-access-fz9dh\") pod \"nova-cell0-db-create-9kg2m\" (UID: \"1a093673-8ed6-457e-8981-83864827e781\") " pod="openstack/nova-cell0-db-create-9kg2m" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.012680 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09d03d05-f63f-4a84-be7e-fcfaaae0505d-operator-scripts\") pod \"nova-api-db-create-kks6g\" (UID: \"09d03d05-f63f-4a84-be7e-fcfaaae0505d\") " pod="openstack/nova-api-db-create-kks6g" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.013354 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09d03d05-f63f-4a84-be7e-fcfaaae0505d-operator-scripts\") pod \"nova-api-db-create-kks6g\" (UID: \"09d03d05-f63f-4a84-be7e-fcfaaae0505d\") " pod="openstack/nova-api-db-create-kks6g" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.013430 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smwz5\" (UniqueName: \"kubernetes.io/projected/09d03d05-f63f-4a84-be7e-fcfaaae0505d-kube-api-access-smwz5\") pod \"nova-api-db-create-kks6g\" (UID: \"09d03d05-f63f-4a84-be7e-fcfaaae0505d\") " pod="openstack/nova-api-db-create-kks6g" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.013463 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4clr\" (UniqueName: 
\"kubernetes.io/projected/6be4ea2e-b742-478c-a6e3-56f43a856e40-kube-api-access-x4clr\") pod \"nova-api-04a2-account-create-update-lz6dq\" (UID: \"6be4ea2e-b742-478c-a6e3-56f43a856e40\") " pod="openstack/nova-api-04a2-account-create-update-lz6dq" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.013484 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a093673-8ed6-457e-8981-83864827e781-operator-scripts\") pod \"nova-cell0-db-create-9kg2m\" (UID: \"1a093673-8ed6-457e-8981-83864827e781\") " pod="openstack/nova-cell0-db-create-9kg2m" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.013508 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be4ea2e-b742-478c-a6e3-56f43a856e40-operator-scripts\") pod \"nova-api-04a2-account-create-update-lz6dq\" (UID: \"6be4ea2e-b742-478c-a6e3-56f43a856e40\") " pod="openstack/nova-api-04a2-account-create-update-lz6dq" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.014174 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a093673-8ed6-457e-8981-83864827e781-operator-scripts\") pod \"nova-cell0-db-create-9kg2m\" (UID: \"1a093673-8ed6-457e-8981-83864827e781\") " pod="openstack/nova-cell0-db-create-9kg2m" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.024577 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "189df7f6-2b92-4f36-a6e6-1462bc471159" (UID: "189df7f6-2b92-4f36-a6e6-1462bc471159"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.025392 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189df7f6-2b92-4f36-a6e6-1462bc471159-kube-api-access-ck6bj" (OuterVolumeSpecName: "kube-api-access-ck6bj") pod "189df7f6-2b92-4f36-a6e6-1462bc471159" (UID: "189df7f6-2b92-4f36-a6e6-1462bc471159"). InnerVolumeSpecName "kube-api-access-ck6bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.028238 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smwz5\" (UniqueName: \"kubernetes.io/projected/09d03d05-f63f-4a84-be7e-fcfaaae0505d-kube-api-access-smwz5\") pod \"nova-api-db-create-kks6g\" (UID: \"09d03d05-f63f-4a84-be7e-fcfaaae0505d\") " pod="openstack/nova-api-db-create-kks6g" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.031490 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz9dh\" (UniqueName: \"kubernetes.io/projected/1a093673-8ed6-457e-8981-83864827e781-kube-api-access-fz9dh\") pod \"nova-cell0-db-create-9kg2m\" (UID: \"1a093673-8ed6-457e-8981-83864827e781\") " pod="openstack/nova-cell0-db-create-9kg2m" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.092077 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-sfgd9"] Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.093241 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sfgd9" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.098491 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "189df7f6-2b92-4f36-a6e6-1462bc471159" (UID: "189df7f6-2b92-4f36-a6e6-1462bc471159"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.102367 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sfgd9"] Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.102564 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-config" (OuterVolumeSpecName: "config") pod "189df7f6-2b92-4f36-a6e6-1462bc471159" (UID: "189df7f6-2b92-4f36-a6e6-1462bc471159"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.114360 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhd8m\" (UniqueName: \"kubernetes.io/projected/0c8555bf-e516-4e10-be72-afbbb53fb31e-kube-api-access-bhd8m\") pod \"nova-cell1-db-create-sfgd9\" (UID: \"0c8555bf-e516-4e10-be72-afbbb53fb31e\") " pod="openstack/nova-cell1-db-create-sfgd9" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.114411 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c8555bf-e516-4e10-be72-afbbb53fb31e-operator-scripts\") pod \"nova-cell1-db-create-sfgd9\" (UID: \"0c8555bf-e516-4e10-be72-afbbb53fb31e\") " pod="openstack/nova-cell1-db-create-sfgd9" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.114472 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4clr\" (UniqueName: \"kubernetes.io/projected/6be4ea2e-b742-478c-a6e3-56f43a856e40-kube-api-access-x4clr\") pod \"nova-api-04a2-account-create-update-lz6dq\" (UID: \"6be4ea2e-b742-478c-a6e3-56f43a856e40\") " pod="openstack/nova-api-04a2-account-create-update-lz6dq" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.114560 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be4ea2e-b742-478c-a6e3-56f43a856e40-operator-scripts\") pod \"nova-api-04a2-account-create-update-lz6dq\" (UID: \"6be4ea2e-b742-478c-a6e3-56f43a856e40\") " pod="openstack/nova-api-04a2-account-create-update-lz6dq" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.115067 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.115093 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck6bj\" (UniqueName: \"kubernetes.io/projected/189df7f6-2b92-4f36-a6e6-1462bc471159-kube-api-access-ck6bj\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.115107 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.115121 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.115524 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be4ea2e-b742-478c-a6e3-56f43a856e40-operator-scripts\") pod \"nova-api-04a2-account-create-update-lz6dq\" (UID: \"6be4ea2e-b742-478c-a6e3-56f43a856e40\") " pod="openstack/nova-api-04a2-account-create-update-lz6dq" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.131615 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4clr\" 
(UniqueName: \"kubernetes.io/projected/6be4ea2e-b742-478c-a6e3-56f43a856e40-kube-api-access-x4clr\") pod \"nova-api-04a2-account-create-update-lz6dq\" (UID: \"6be4ea2e-b742-478c-a6e3-56f43a856e40\") " pod="openstack/nova-api-04a2-account-create-update-lz6dq" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.152405 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "189df7f6-2b92-4f36-a6e6-1462bc471159" (UID: "189df7f6-2b92-4f36-a6e6-1462bc471159"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.176690 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3362-account-create-update-tdpdc"] Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.177713 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3362-account-create-update-tdpdc" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.182727 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.193164 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3362-account-create-update-tdpdc"] Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.220097 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-kks6g" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.221432 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhd8m\" (UniqueName: \"kubernetes.io/projected/0c8555bf-e516-4e10-be72-afbbb53fb31e-kube-api-access-bhd8m\") pod \"nova-cell1-db-create-sfgd9\" (UID: \"0c8555bf-e516-4e10-be72-afbbb53fb31e\") " pod="openstack/nova-cell1-db-create-sfgd9" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.221486 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4xjn\" (UniqueName: \"kubernetes.io/projected/698029af-0c10-4446-81f0-fd59859b8722-kube-api-access-v4xjn\") pod \"nova-cell0-3362-account-create-update-tdpdc\" (UID: \"698029af-0c10-4446-81f0-fd59859b8722\") " pod="openstack/nova-cell0-3362-account-create-update-tdpdc" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.221526 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c8555bf-e516-4e10-be72-afbbb53fb31e-operator-scripts\") pod \"nova-cell1-db-create-sfgd9\" (UID: \"0c8555bf-e516-4e10-be72-afbbb53fb31e\") " pod="openstack/nova-cell1-db-create-sfgd9" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.221544 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698029af-0c10-4446-81f0-fd59859b8722-operator-scripts\") pod \"nova-cell0-3362-account-create-update-tdpdc\" (UID: \"698029af-0c10-4446-81f0-fd59859b8722\") " pod="openstack/nova-cell0-3362-account-create-update-tdpdc" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.221621 4901 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/189df7f6-2b92-4f36-a6e6-1462bc471159-ovndb-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.224650 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c8555bf-e516-4e10-be72-afbbb53fb31e-operator-scripts\") pod \"nova-cell1-db-create-sfgd9\" (UID: \"0c8555bf-e516-4e10-be72-afbbb53fb31e\") " pod="openstack/nova-cell1-db-create-sfgd9" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.234095 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9kg2m" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.243677 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhd8m\" (UniqueName: \"kubernetes.io/projected/0c8555bf-e516-4e10-be72-afbbb53fb31e-kube-api-access-bhd8m\") pod \"nova-cell1-db-create-sfgd9\" (UID: \"0c8555bf-e516-4e10-be72-afbbb53fb31e\") " pod="openstack/nova-cell1-db-create-sfgd9" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.300616 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-04a2-account-create-update-lz6dq" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.323465 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4xjn\" (UniqueName: \"kubernetes.io/projected/698029af-0c10-4446-81f0-fd59859b8722-kube-api-access-v4xjn\") pod \"nova-cell0-3362-account-create-update-tdpdc\" (UID: \"698029af-0c10-4446-81f0-fd59859b8722\") " pod="openstack/nova-cell0-3362-account-create-update-tdpdc" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.323522 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698029af-0c10-4446-81f0-fd59859b8722-operator-scripts\") pod \"nova-cell0-3362-account-create-update-tdpdc\" (UID: \"698029af-0c10-4446-81f0-fd59859b8722\") " pod="openstack/nova-cell0-3362-account-create-update-tdpdc" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.324256 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698029af-0c10-4446-81f0-fd59859b8722-operator-scripts\") pod \"nova-cell0-3362-account-create-update-tdpdc\" (UID: \"698029af-0c10-4446-81f0-fd59859b8722\") " pod="openstack/nova-cell0-3362-account-create-update-tdpdc" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.346998 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4xjn\" (UniqueName: \"kubernetes.io/projected/698029af-0c10-4446-81f0-fd59859b8722-kube-api-access-v4xjn\") pod \"nova-cell0-3362-account-create-update-tdpdc\" (UID: \"698029af-0c10-4446-81f0-fd59859b8722\") " pod="openstack/nova-cell0-3362-account-create-update-tdpdc" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.383521 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-457m6"] Mar 09 03:03:20 crc kubenswrapper[4901]: 
I0309 03:03:20.387153 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cddb-account-create-update-457m6" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.389201 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.399172 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-457m6"] Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.417906 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sfgd9" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.424956 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lt76\" (UniqueName: \"kubernetes.io/projected/6494c542-3d82-43a7-b938-77820e0d3adb-kube-api-access-7lt76\") pod \"nova-cell1-cddb-account-create-update-457m6\" (UID: \"6494c542-3d82-43a7-b938-77820e0d3adb\") " pod="openstack/nova-cell1-cddb-account-create-update-457m6" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.425005 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6494c542-3d82-43a7-b938-77820e0d3adb-operator-scripts\") pod \"nova-cell1-cddb-account-create-update-457m6\" (UID: \"6494c542-3d82-43a7-b938-77820e0d3adb\") " pod="openstack/nova-cell1-cddb-account-create-update-457m6" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.505622 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3362-account-create-update-tdpdc" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.526587 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lt76\" (UniqueName: \"kubernetes.io/projected/6494c542-3d82-43a7-b938-77820e0d3adb-kube-api-access-7lt76\") pod \"nova-cell1-cddb-account-create-update-457m6\" (UID: \"6494c542-3d82-43a7-b938-77820e0d3adb\") " pod="openstack/nova-cell1-cddb-account-create-update-457m6" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.526629 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6494c542-3d82-43a7-b938-77820e0d3adb-operator-scripts\") pod \"nova-cell1-cddb-account-create-update-457m6\" (UID: \"6494c542-3d82-43a7-b938-77820e0d3adb\") " pod="openstack/nova-cell1-cddb-account-create-update-457m6" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.527415 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6494c542-3d82-43a7-b938-77820e0d3adb-operator-scripts\") pod \"nova-cell1-cddb-account-create-update-457m6\" (UID: \"6494c542-3d82-43a7-b938-77820e0d3adb\") " pod="openstack/nova-cell1-cddb-account-create-update-457m6" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.545846 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lt76\" (UniqueName: \"kubernetes.io/projected/6494c542-3d82-43a7-b938-77820e0d3adb-kube-api-access-7lt76\") pod \"nova-cell1-cddb-account-create-update-457m6\" (UID: \"6494c542-3d82-43a7-b938-77820e0d3adb\") " pod="openstack/nova-cell1-cddb-account-create-update-457m6" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.712254 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cddb-account-create-update-457m6" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.720429 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kks6g"] Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.760388 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9kg2m"] Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.849032 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kks6g" event={"ID":"09d03d05-f63f-4a84-be7e-fcfaaae0505d","Type":"ContainerStarted","Data":"2f68fdba62b1106ac238c08aaa24a025243e14c502740753157ab24d625d425f"} Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.862032 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766b7d5cd8-9xjl5" event={"ID":"189df7f6-2b92-4f36-a6e6-1462bc471159","Type":"ContainerDied","Data":"7ab7fb0f26a5094af208c916b52ca40577d07593fc91974ee793bf1a6b39da45"} Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.862077 4901 scope.go:117] "RemoveContainer" containerID="b63c90a7bd193594cf46a664293cb499ada4dd2e2e4629d238df3be2d6a82bd4" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.862205 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-766b7d5cd8-9xjl5" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.868372 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9kg2m" event={"ID":"1a093673-8ed6-457e-8981-83864827e781","Type":"ContainerStarted","Data":"40ff592dd12b17efac8ca0f8c1019a6c829e6bbbee7aae92fde827e5a3712e20"} Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.890664 4901 scope.go:117] "RemoveContainer" containerID="bb6a418de9f8462e709a1779b927ca42664b3205b311697c756748bd01a3bb1e" Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.935524 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-04a2-account-create-update-lz6dq"] Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.949688 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-766b7d5cd8-9xjl5"] Mar 09 03:03:20 crc kubenswrapper[4901]: W0309 03:03:20.951045 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6be4ea2e_b742_478c_a6e3_56f43a856e40.slice/crio-49d33c27c183d4c874ee92985ba5530944c35196dd92b5457e32ad18b82e8432 WatchSource:0}: Error finding container 49d33c27c183d4c874ee92985ba5530944c35196dd92b5457e32ad18b82e8432: Status 404 returned error can't find the container with id 49d33c27c183d4c874ee92985ba5530944c35196dd92b5457e32ad18b82e8432 Mar 09 03:03:20 crc kubenswrapper[4901]: I0309 03:03:20.960925 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-766b7d5cd8-9xjl5"] Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.010755 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sfgd9"] Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.088516 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3362-account-create-update-tdpdc"] Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.263622 
4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-457m6"] Mar 09 03:03:21 crc kubenswrapper[4901]: W0309 03:03:21.306902 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6494c542_3d82_43a7_b938_77820e0d3adb.slice/crio-48d73beadb5929df01b27e62c78eb757d2acbe029c407b1311ab3c6271a73cad WatchSource:0}: Error finding container 48d73beadb5929df01b27e62c78eb757d2acbe029c407b1311ab3c6271a73cad: Status 404 returned error can't find the container with id 48d73beadb5929df01b27e62c78eb757d2acbe029c407b1311ab3c6271a73cad Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.877975 4901 generic.go:334] "Generic (PLEG): container finished" podID="09d03d05-f63f-4a84-be7e-fcfaaae0505d" containerID="d79cab62bb0112d2d186528ed06fc210871e18ac819e1d659adeeda8aaae0c07" exitCode=0 Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.878033 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kks6g" event={"ID":"09d03d05-f63f-4a84-be7e-fcfaaae0505d","Type":"ContainerDied","Data":"d79cab62bb0112d2d186528ed06fc210871e18ac819e1d659adeeda8aaae0c07"} Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.880347 4901 generic.go:334] "Generic (PLEG): container finished" podID="0c8555bf-e516-4e10-be72-afbbb53fb31e" containerID="e1228ba42fc90c0cde6de4a3ef427e708441ad24f194f71b8f084075c75abf93" exitCode=0 Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.880398 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sfgd9" event={"ID":"0c8555bf-e516-4e10-be72-afbbb53fb31e","Type":"ContainerDied","Data":"e1228ba42fc90c0cde6de4a3ef427e708441ad24f194f71b8f084075c75abf93"} Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.880447 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sfgd9" 
event={"ID":"0c8555bf-e516-4e10-be72-afbbb53fb31e","Type":"ContainerStarted","Data":"4cee02a4b4c650b8643e7983ddfa80cd841b9e25a9939728f64400dc0b42e7a6"} Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.886670 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cddb-account-create-update-457m6" event={"ID":"6494c542-3d82-43a7-b938-77820e0d3adb","Type":"ContainerStarted","Data":"cfc47f8beb8ecefcfac0787d59cd1dc8d3416566c1269b0f491dd5ea14608964"} Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.886724 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cddb-account-create-update-457m6" event={"ID":"6494c542-3d82-43a7-b938-77820e0d3adb","Type":"ContainerStarted","Data":"48d73beadb5929df01b27e62c78eb757d2acbe029c407b1311ab3c6271a73cad"} Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.890907 4901 generic.go:334] "Generic (PLEG): container finished" podID="6be4ea2e-b742-478c-a6e3-56f43a856e40" containerID="bc2e0443e15be5cd6de9ba14dfcdf841d6b447a932b62328a61c9eff56f7aba6" exitCode=0 Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.891003 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-04a2-account-create-update-lz6dq" event={"ID":"6be4ea2e-b742-478c-a6e3-56f43a856e40","Type":"ContainerDied","Data":"bc2e0443e15be5cd6de9ba14dfcdf841d6b447a932b62328a61c9eff56f7aba6"} Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.891055 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-04a2-account-create-update-lz6dq" event={"ID":"6be4ea2e-b742-478c-a6e3-56f43a856e40","Type":"ContainerStarted","Data":"49d33c27c183d4c874ee92985ba5530944c35196dd92b5457e32ad18b82e8432"} Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.892462 4901 generic.go:334] "Generic (PLEG): container finished" podID="698029af-0c10-4446-81f0-fd59859b8722" containerID="ff6b196fb187bea817fb3de6278431b1813fe38037e7088e05efb7c067276b2c" exitCode=0 Mar 09 03:03:21 crc 
kubenswrapper[4901]: I0309 03:03:21.892509 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3362-account-create-update-tdpdc" event={"ID":"698029af-0c10-4446-81f0-fd59859b8722","Type":"ContainerDied","Data":"ff6b196fb187bea817fb3de6278431b1813fe38037e7088e05efb7c067276b2c"} Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.892539 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3362-account-create-update-tdpdc" event={"ID":"698029af-0c10-4446-81f0-fd59859b8722","Type":"ContainerStarted","Data":"6351e36263357c5ca71c1314284c05402a6e619a75b249f49f7c8e6f8eea1034"} Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.893654 4901 generic.go:334] "Generic (PLEG): container finished" podID="1a093673-8ed6-457e-8981-83864827e781" containerID="d15cb14ee1f318a86895da38e06143089829e2c3813d3c07043aacb028b2aada" exitCode=0 Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.893698 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9kg2m" event={"ID":"1a093673-8ed6-457e-8981-83864827e781","Type":"ContainerDied","Data":"d15cb14ee1f318a86895da38e06143089829e2c3813d3c07043aacb028b2aada"} Mar 09 03:03:21 crc kubenswrapper[4901]: I0309 03:03:21.955603 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cddb-account-create-update-457m6" podStartSLOduration=1.955587898 podStartE2EDuration="1.955587898s" podCreationTimestamp="2026-03-09 03:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:03:21.950938211 +0000 UTC m=+1326.540601943" watchObservedRunningTime="2026-03-09 03:03:21.955587898 +0000 UTC m=+1326.545251620" Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.029111 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 03:03:22 crc 
kubenswrapper[4901]: I0309 03:03:22.029174 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.061114 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.069059 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.116854 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="189df7f6-2b92-4f36-a6e6-1462bc471159" path="/var/lib/kubelet/pods/189df7f6-2b92-4f36-a6e6-1462bc471159/volumes" Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.458859 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.480095 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.562568 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-588d7b64fd-wbsl2"] Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.562869 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-588d7b64fd-wbsl2" podUID="50fd5778-3018-4a41-8db7-285ca63540a5" containerName="placement-log" containerID="cri-o://92f0b018fb501ac1d31f864c3c26312799db4eaeb5131c441e1d09cb7f469811" gracePeriod=30 Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.562990 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-588d7b64fd-wbsl2" podUID="50fd5778-3018-4a41-8db7-285ca63540a5" containerName="placement-api" 
containerID="cri-o://0b687bccedee7ceb14535726d105414601f7fa2b74a29a4654dc4138b0a31d91" gracePeriod=30 Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.902659 4901 generic.go:334] "Generic (PLEG): container finished" podID="50fd5778-3018-4a41-8db7-285ca63540a5" containerID="92f0b018fb501ac1d31f864c3c26312799db4eaeb5131c441e1d09cb7f469811" exitCode=143 Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.902720 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-588d7b64fd-wbsl2" event={"ID":"50fd5778-3018-4a41-8db7-285ca63540a5","Type":"ContainerDied","Data":"92f0b018fb501ac1d31f864c3c26312799db4eaeb5131c441e1d09cb7f469811"} Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.905938 4901 generic.go:334] "Generic (PLEG): container finished" podID="6494c542-3d82-43a7-b938-77820e0d3adb" containerID="cfc47f8beb8ecefcfac0787d59cd1dc8d3416566c1269b0f491dd5ea14608964" exitCode=0 Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.906037 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cddb-account-create-update-457m6" event={"ID":"6494c542-3d82-43a7-b938-77820e0d3adb","Type":"ContainerDied","Data":"cfc47f8beb8ecefcfac0787d59cd1dc8d3416566c1269b0f491dd5ea14608964"} Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.907487 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 03:03:22 crc kubenswrapper[4901]: I0309 03:03:22.907595 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.160034 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.169489 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:23 crc 
kubenswrapper[4901]: I0309 03:03:23.198206 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.250573 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.342880 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3362-account-create-update-tdpdc" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.492545 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4xjn\" (UniqueName: \"kubernetes.io/projected/698029af-0c10-4446-81f0-fd59859b8722-kube-api-access-v4xjn\") pod \"698029af-0c10-4446-81f0-fd59859b8722\" (UID: \"698029af-0c10-4446-81f0-fd59859b8722\") " Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.492630 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698029af-0c10-4446-81f0-fd59859b8722-operator-scripts\") pod \"698029af-0c10-4446-81f0-fd59859b8722\" (UID: \"698029af-0c10-4446-81f0-fd59859b8722\") " Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.497006 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698029af-0c10-4446-81f0-fd59859b8722-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "698029af-0c10-4446-81f0-fd59859b8722" (UID: "698029af-0c10-4446-81f0-fd59859b8722"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.510598 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698029af-0c10-4446-81f0-fd59859b8722-kube-api-access-v4xjn" (OuterVolumeSpecName: "kube-api-access-v4xjn") pod "698029af-0c10-4446-81f0-fd59859b8722" (UID: "698029af-0c10-4446-81f0-fd59859b8722"). InnerVolumeSpecName "kube-api-access-v4xjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.512882 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9kg2m" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.533610 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-04a2-account-create-update-lz6dq" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.537925 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sfgd9" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.541826 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-kks6g" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.598647 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4xjn\" (UniqueName: \"kubernetes.io/projected/698029af-0c10-4446-81f0-fd59859b8722-kube-api-access-v4xjn\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.598677 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698029af-0c10-4446-81f0-fd59859b8722-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.699670 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a093673-8ed6-457e-8981-83864827e781-operator-scripts\") pod \"1a093673-8ed6-457e-8981-83864827e781\" (UID: \"1a093673-8ed6-457e-8981-83864827e781\") " Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.699750 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4clr\" (UniqueName: \"kubernetes.io/projected/6be4ea2e-b742-478c-a6e3-56f43a856e40-kube-api-access-x4clr\") pod \"6be4ea2e-b742-478c-a6e3-56f43a856e40\" (UID: \"6be4ea2e-b742-478c-a6e3-56f43a856e40\") " Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.699778 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhd8m\" (UniqueName: \"kubernetes.io/projected/0c8555bf-e516-4e10-be72-afbbb53fb31e-kube-api-access-bhd8m\") pod \"0c8555bf-e516-4e10-be72-afbbb53fb31e\" (UID: \"0c8555bf-e516-4e10-be72-afbbb53fb31e\") " Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.699804 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09d03d05-f63f-4a84-be7e-fcfaaae0505d-operator-scripts\") pod 
\"09d03d05-f63f-4a84-be7e-fcfaaae0505d\" (UID: \"09d03d05-f63f-4a84-be7e-fcfaaae0505d\") " Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.699866 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz9dh\" (UniqueName: \"kubernetes.io/projected/1a093673-8ed6-457e-8981-83864827e781-kube-api-access-fz9dh\") pod \"1a093673-8ed6-457e-8981-83864827e781\" (UID: \"1a093673-8ed6-457e-8981-83864827e781\") " Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.699889 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be4ea2e-b742-478c-a6e3-56f43a856e40-operator-scripts\") pod \"6be4ea2e-b742-478c-a6e3-56f43a856e40\" (UID: \"6be4ea2e-b742-478c-a6e3-56f43a856e40\") " Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.699957 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c8555bf-e516-4e10-be72-afbbb53fb31e-operator-scripts\") pod \"0c8555bf-e516-4e10-be72-afbbb53fb31e\" (UID: \"0c8555bf-e516-4e10-be72-afbbb53fb31e\") " Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.700005 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smwz5\" (UniqueName: \"kubernetes.io/projected/09d03d05-f63f-4a84-be7e-fcfaaae0505d-kube-api-access-smwz5\") pod \"09d03d05-f63f-4a84-be7e-fcfaaae0505d\" (UID: \"09d03d05-f63f-4a84-be7e-fcfaaae0505d\") " Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.707665 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d03d05-f63f-4a84-be7e-fcfaaae0505d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09d03d05-f63f-4a84-be7e-fcfaaae0505d" (UID: "09d03d05-f63f-4a84-be7e-fcfaaae0505d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.708251 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a093673-8ed6-457e-8981-83864827e781-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a093673-8ed6-457e-8981-83864827e781" (UID: "1a093673-8ed6-457e-8981-83864827e781"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.708265 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c8555bf-e516-4e10-be72-afbbb53fb31e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c8555bf-e516-4e10-be72-afbbb53fb31e" (UID: "0c8555bf-e516-4e10-be72-afbbb53fb31e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.708585 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be4ea2e-b742-478c-a6e3-56f43a856e40-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6be4ea2e-b742-478c-a6e3-56f43a856e40" (UID: "6be4ea2e-b742-478c-a6e3-56f43a856e40"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.711472 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8555bf-e516-4e10-be72-afbbb53fb31e-kube-api-access-bhd8m" (OuterVolumeSpecName: "kube-api-access-bhd8m") pod "0c8555bf-e516-4e10-be72-afbbb53fb31e" (UID: "0c8555bf-e516-4e10-be72-afbbb53fb31e"). InnerVolumeSpecName "kube-api-access-bhd8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.711506 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be4ea2e-b742-478c-a6e3-56f43a856e40-kube-api-access-x4clr" (OuterVolumeSpecName: "kube-api-access-x4clr") pod "6be4ea2e-b742-478c-a6e3-56f43a856e40" (UID: "6be4ea2e-b742-478c-a6e3-56f43a856e40"). InnerVolumeSpecName "kube-api-access-x4clr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.711877 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a093673-8ed6-457e-8981-83864827e781-kube-api-access-fz9dh" (OuterVolumeSpecName: "kube-api-access-fz9dh") pod "1a093673-8ed6-457e-8981-83864827e781" (UID: "1a093673-8ed6-457e-8981-83864827e781"). InnerVolumeSpecName "kube-api-access-fz9dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.712190 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d03d05-f63f-4a84-be7e-fcfaaae0505d-kube-api-access-smwz5" (OuterVolumeSpecName: "kube-api-access-smwz5") pod "09d03d05-f63f-4a84-be7e-fcfaaae0505d" (UID: "09d03d05-f63f-4a84-be7e-fcfaaae0505d"). InnerVolumeSpecName "kube-api-access-smwz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.808770 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a093673-8ed6-457e-8981-83864827e781-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.809008 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4clr\" (UniqueName: \"kubernetes.io/projected/6be4ea2e-b742-478c-a6e3-56f43a856e40-kube-api-access-x4clr\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.809020 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhd8m\" (UniqueName: \"kubernetes.io/projected/0c8555bf-e516-4e10-be72-afbbb53fb31e-kube-api-access-bhd8m\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.809031 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09d03d05-f63f-4a84-be7e-fcfaaae0505d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.809040 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz9dh\" (UniqueName: \"kubernetes.io/projected/1a093673-8ed6-457e-8981-83864827e781-kube-api-access-fz9dh\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.809048 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be4ea2e-b742-478c-a6e3-56f43a856e40-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.809056 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c8555bf-e516-4e10-be72-afbbb53fb31e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 
03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.809066 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smwz5\" (UniqueName: \"kubernetes.io/projected/09d03d05-f63f-4a84-be7e-fcfaaae0505d-kube-api-access-smwz5\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.918495 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9kg2m" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.918508 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9kg2m" event={"ID":"1a093673-8ed6-457e-8981-83864827e781","Type":"ContainerDied","Data":"40ff592dd12b17efac8ca0f8c1019a6c829e6bbbee7aae92fde827e5a3712e20"} Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.918544 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40ff592dd12b17efac8ca0f8c1019a6c829e6bbbee7aae92fde827e5a3712e20" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.921159 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kks6g" event={"ID":"09d03d05-f63f-4a84-be7e-fcfaaae0505d","Type":"ContainerDied","Data":"2f68fdba62b1106ac238c08aaa24a025243e14c502740753157ab24d625d425f"} Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.921192 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f68fdba62b1106ac238c08aaa24a025243e14c502740753157ab24d625d425f" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.921285 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kks6g" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.925174 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-sfgd9" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.925310 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sfgd9" event={"ID":"0c8555bf-e516-4e10-be72-afbbb53fb31e","Type":"ContainerDied","Data":"4cee02a4b4c650b8643e7983ddfa80cd841b9e25a9939728f64400dc0b42e7a6"} Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.925370 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cee02a4b4c650b8643e7983ddfa80cd841b9e25a9939728f64400dc0b42e7a6" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.926501 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-04a2-account-create-update-lz6dq" event={"ID":"6be4ea2e-b742-478c-a6e3-56f43a856e40","Type":"ContainerDied","Data":"49d33c27c183d4c874ee92985ba5530944c35196dd92b5457e32ad18b82e8432"} Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.926524 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49d33c27c183d4c874ee92985ba5530944c35196dd92b5457e32ad18b82e8432" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.926556 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-04a2-account-create-update-lz6dq" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.931469 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3362-account-create-update-tdpdc" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.931631 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3362-account-create-update-tdpdc" event={"ID":"698029af-0c10-4446-81f0-fd59859b8722","Type":"ContainerDied","Data":"6351e36263357c5ca71c1314284c05402a6e619a75b249f49f7c8e6f8eea1034"} Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.931712 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6351e36263357c5ca71c1314284c05402a6e619a75b249f49f7c8e6f8eea1034" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.933035 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:23 crc kubenswrapper[4901]: I0309 03:03:23.934125 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.307488 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cddb-account-create-update-457m6" Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.418997 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6494c542-3d82-43a7-b938-77820e0d3adb-operator-scripts\") pod \"6494c542-3d82-43a7-b938-77820e0d3adb\" (UID: \"6494c542-3d82-43a7-b938-77820e0d3adb\") " Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.419065 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lt76\" (UniqueName: \"kubernetes.io/projected/6494c542-3d82-43a7-b938-77820e0d3adb-kube-api-access-7lt76\") pod \"6494c542-3d82-43a7-b938-77820e0d3adb\" (UID: \"6494c542-3d82-43a7-b938-77820e0d3adb\") " Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.419500 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6494c542-3d82-43a7-b938-77820e0d3adb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6494c542-3d82-43a7-b938-77820e0d3adb" (UID: "6494c542-3d82-43a7-b938-77820e0d3adb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.423295 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6494c542-3d82-43a7-b938-77820e0d3adb-kube-api-access-7lt76" (OuterVolumeSpecName: "kube-api-access-7lt76") pod "6494c542-3d82-43a7-b938-77820e0d3adb" (UID: "6494c542-3d82-43a7-b938-77820e0d3adb"). InnerVolumeSpecName "kube-api-access-7lt76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.520991 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6494c542-3d82-43a7-b938-77820e0d3adb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.521035 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lt76\" (UniqueName: \"kubernetes.io/projected/6494c542-3d82-43a7-b938-77820e0d3adb-kube-api-access-7lt76\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.869082 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.879181 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.939253 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cddb-account-create-update-457m6" event={"ID":"6494c542-3d82-43a7-b938-77820e0d3adb","Type":"ContainerDied","Data":"48d73beadb5929df01b27e62c78eb757d2acbe029c407b1311ab3c6271a73cad"} Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.939296 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48d73beadb5929df01b27e62c78eb757d2acbe029c407b1311ab3c6271a73cad" Mar 09 03:03:24 crc kubenswrapper[4901]: I0309 03:03:24.939336 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cddb-account-create-update-457m6" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.668339 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c2nzk"] Mar 09 03:03:25 crc kubenswrapper[4901]: E0309 03:03:25.668952 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d03d05-f63f-4a84-be7e-fcfaaae0505d" containerName="mariadb-database-create" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.669037 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d03d05-f63f-4a84-be7e-fcfaaae0505d" containerName="mariadb-database-create" Mar 09 03:03:25 crc kubenswrapper[4901]: E0309 03:03:25.669095 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a093673-8ed6-457e-8981-83864827e781" containerName="mariadb-database-create" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.669143 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a093673-8ed6-457e-8981-83864827e781" containerName="mariadb-database-create" Mar 09 03:03:25 crc kubenswrapper[4901]: E0309 03:03:25.669203 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be4ea2e-b742-478c-a6e3-56f43a856e40" containerName="mariadb-account-create-update" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.669281 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be4ea2e-b742-478c-a6e3-56f43a856e40" containerName="mariadb-account-create-update" Mar 09 03:03:25 crc kubenswrapper[4901]: E0309 03:03:25.669341 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698029af-0c10-4446-81f0-fd59859b8722" containerName="mariadb-account-create-update" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.669407 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="698029af-0c10-4446-81f0-fd59859b8722" containerName="mariadb-account-create-update" Mar 09 03:03:25 crc kubenswrapper[4901]: E0309 03:03:25.669469 4901 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6494c542-3d82-43a7-b938-77820e0d3adb" containerName="mariadb-account-create-update" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.669518 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="6494c542-3d82-43a7-b938-77820e0d3adb" containerName="mariadb-account-create-update" Mar 09 03:03:25 crc kubenswrapper[4901]: E0309 03:03:25.669580 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8555bf-e516-4e10-be72-afbbb53fb31e" containerName="mariadb-database-create" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.669634 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8555bf-e516-4e10-be72-afbbb53fb31e" containerName="mariadb-database-create" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.669846 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="6494c542-3d82-43a7-b938-77820e0d3adb" containerName="mariadb-account-create-update" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.669913 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be4ea2e-b742-478c-a6e3-56f43a856e40" containerName="mariadb-account-create-update" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.670166 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="698029af-0c10-4446-81f0-fd59859b8722" containerName="mariadb-account-create-update" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.670235 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a093673-8ed6-457e-8981-83864827e781" containerName="mariadb-database-create" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.670300 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d03d05-f63f-4a84-be7e-fcfaaae0505d" containerName="mariadb-database-create" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.670362 4901 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0c8555bf-e516-4e10-be72-afbbb53fb31e" containerName="mariadb-database-create" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.670935 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.673052 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.673380 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.673421 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-sm9q2" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.684733 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c2nzk"] Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.794498 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.840588 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.840673 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl8vz\" (UniqueName: \"kubernetes.io/projected/bfeef473-c12a-4af0-9a27-f1fe52a0b144-kube-api-access-dl8vz\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " 
pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.840871 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-config-data\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.840957 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-scripts\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.865719 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.942245 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.942334 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl8vz\" (UniqueName: \"kubernetes.io/projected/bfeef473-c12a-4af0-9a27-f1fe52a0b144-kube-api-access-dl8vz\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.942382 4901 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-config-data\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.942412 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-scripts\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.947777 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-scripts\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.948408 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.948424 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-config-data\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.952107 4901 generic.go:334] "Generic (PLEG): container finished" podID="50fd5778-3018-4a41-8db7-285ca63540a5" 
containerID="0b687bccedee7ceb14535726d105414601f7fa2b74a29a4654dc4138b0a31d91" exitCode=0 Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.952915 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-588d7b64fd-wbsl2" event={"ID":"50fd5778-3018-4a41-8db7-285ca63540a5","Type":"ContainerDied","Data":"0b687bccedee7ceb14535726d105414601f7fa2b74a29a4654dc4138b0a31d91"} Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.969834 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl8vz\" (UniqueName: \"kubernetes.io/projected/bfeef473-c12a-4af0-9a27-f1fe52a0b144-kube-api-access-dl8vz\") pod \"nova-cell0-conductor-db-sync-c2nzk\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:25 crc kubenswrapper[4901]: I0309 03:03:25.984726 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.108962 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.246983 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-internal-tls-certs\") pod \"50fd5778-3018-4a41-8db7-285ca63540a5\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.247089 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-scripts\") pod \"50fd5778-3018-4a41-8db7-285ca63540a5\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.247113 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-combined-ca-bundle\") pod \"50fd5778-3018-4a41-8db7-285ca63540a5\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.247145 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-public-tls-certs\") pod \"50fd5778-3018-4a41-8db7-285ca63540a5\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.247188 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50fd5778-3018-4a41-8db7-285ca63540a5-logs\") pod \"50fd5778-3018-4a41-8db7-285ca63540a5\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.247247 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n7x9\" (UniqueName: 
\"kubernetes.io/projected/50fd5778-3018-4a41-8db7-285ca63540a5-kube-api-access-6n7x9\") pod \"50fd5778-3018-4a41-8db7-285ca63540a5\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.247322 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-config-data\") pod \"50fd5778-3018-4a41-8db7-285ca63540a5\" (UID: \"50fd5778-3018-4a41-8db7-285ca63540a5\") " Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.250583 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50fd5778-3018-4a41-8db7-285ca63540a5-logs" (OuterVolumeSpecName: "logs") pod "50fd5778-3018-4a41-8db7-285ca63540a5" (UID: "50fd5778-3018-4a41-8db7-285ca63540a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.252983 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-scripts" (OuterVolumeSpecName: "scripts") pod "50fd5778-3018-4a41-8db7-285ca63540a5" (UID: "50fd5778-3018-4a41-8db7-285ca63540a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.259570 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50fd5778-3018-4a41-8db7-285ca63540a5-kube-api-access-6n7x9" (OuterVolumeSpecName: "kube-api-access-6n7x9") pod "50fd5778-3018-4a41-8db7-285ca63540a5" (UID: "50fd5778-3018-4a41-8db7-285ca63540a5"). InnerVolumeSpecName "kube-api-access-6n7x9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.303577 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50fd5778-3018-4a41-8db7-285ca63540a5" (UID: "50fd5778-3018-4a41-8db7-285ca63540a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.337025 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-config-data" (OuterVolumeSpecName: "config-data") pod "50fd5778-3018-4a41-8db7-285ca63540a5" (UID: "50fd5778-3018-4a41-8db7-285ca63540a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.348868 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.348898 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.348914 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50fd5778-3018-4a41-8db7-285ca63540a5-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.348941 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n7x9\" (UniqueName: \"kubernetes.io/projected/50fd5778-3018-4a41-8db7-285ca63540a5-kube-api-access-6n7x9\") on node \"crc\" DevicePath \"\"" Mar 09 
03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.348950 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.364343 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "50fd5778-3018-4a41-8db7-285ca63540a5" (UID: "50fd5778-3018-4a41-8db7-285ca63540a5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.367635 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "50fd5778-3018-4a41-8db7-285ca63540a5" (UID: "50fd5778-3018-4a41-8db7-285ca63540a5"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.449977 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.450012 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50fd5778-3018-4a41-8db7-285ca63540a5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:26 crc kubenswrapper[4901]: W0309 03:03:26.473528 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfeef473_c12a_4af0_9a27_f1fe52a0b144.slice/crio-fb2766afb4515aeba13db4f1e390af23926fd0767969f9d8a0a8a96e91fa1421 WatchSource:0}: Error finding container fb2766afb4515aeba13db4f1e390af23926fd0767969f9d8a0a8a96e91fa1421: Status 404 returned error can't find the container with id fb2766afb4515aeba13db4f1e390af23926fd0767969f9d8a0a8a96e91fa1421 Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.482165 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c2nzk"] Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.962088 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c2nzk" event={"ID":"bfeef473-c12a-4af0-9a27-f1fe52a0b144","Type":"ContainerStarted","Data":"fb2766afb4515aeba13db4f1e390af23926fd0767969f9d8a0a8a96e91fa1421"} Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.964070 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-588d7b64fd-wbsl2" Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.964066 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-588d7b64fd-wbsl2" event={"ID":"50fd5778-3018-4a41-8db7-285ca63540a5","Type":"ContainerDied","Data":"eb6bf3e01b9775960887acf875cff95b811097b67344ba8c4818e22727cd4a25"} Mar 09 03:03:26 crc kubenswrapper[4901]: I0309 03:03:26.964134 4901 scope.go:117] "RemoveContainer" containerID="0b687bccedee7ceb14535726d105414601f7fa2b74a29a4654dc4138b0a31d91" Mar 09 03:03:27 crc kubenswrapper[4901]: I0309 03:03:27.001190 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-588d7b64fd-wbsl2"] Mar 09 03:03:27 crc kubenswrapper[4901]: I0309 03:03:27.006851 4901 scope.go:117] "RemoveContainer" containerID="92f0b018fb501ac1d31f864c3c26312799db4eaeb5131c441e1d09cb7f469811" Mar 09 03:03:27 crc kubenswrapper[4901]: I0309 03:03:27.011444 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-588d7b64fd-wbsl2"] Mar 09 03:03:28 crc kubenswrapper[4901]: I0309 03:03:28.123169 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50fd5778-3018-4a41-8db7-285ca63540a5" path="/var/lib/kubelet/pods/50fd5778-3018-4a41-8db7-285ca63540a5/volumes" Mar 09 03:03:30 crc kubenswrapper[4901]: I0309 03:03:30.863032 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:03:30 crc kubenswrapper[4901]: I0309 03:03:30.863542 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:03:33 crc kubenswrapper[4901]: I0309 03:03:33.312957 4901 scope.go:117] "RemoveContainer" containerID="5d3dd1d530d4253c5d537f4e04d650d84a9ab3eaad7b29ee545fc4ab6206bf33" Mar 09 03:03:33 crc kubenswrapper[4901]: I0309 03:03:33.360581 4901 scope.go:117] "RemoveContainer" containerID="53355a9b10878fa8dc367f9ed248fb2df866e2c47805aafb50d8e19ea55cb5f1" Mar 09 03:03:33 crc kubenswrapper[4901]: I0309 03:03:33.542057 4901 scope.go:117] "RemoveContainer" containerID="98e3856142f4088ee2a834c7a3ca1c4a96be53ef6e077ab66e8f17f522b99fd7" Mar 09 03:03:34 crc kubenswrapper[4901]: I0309 03:03:34.083310 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c2nzk" event={"ID":"bfeef473-c12a-4af0-9a27-f1fe52a0b144","Type":"ContainerStarted","Data":"f20698a7666281d2b7b17ad3948cf16cb7044c99d7b7d9bd908fc1bc3eea7a32"} Mar 09 03:03:34 crc kubenswrapper[4901]: I0309 03:03:34.101968 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-c2nzk" podStartSLOduration=2.215254079 podStartE2EDuration="9.101901965s" podCreationTimestamp="2026-03-09 03:03:25 +0000 UTC" firstStartedPulling="2026-03-09 03:03:26.475598223 +0000 UTC m=+1331.065261955" lastFinishedPulling="2026-03-09 03:03:33.362246099 +0000 UTC m=+1337.951909841" observedRunningTime="2026-03-09 03:03:34.094805696 +0000 UTC m=+1338.684469428" watchObservedRunningTime="2026-03-09 03:03:34.101901965 +0000 UTC m=+1338.691565697" Mar 09 03:03:37 crc kubenswrapper[4901]: I0309 03:03:37.008152 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.134366 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.200729 4901 generic.go:334] "Generic (PLEG): container finished" podID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerID="ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2" exitCode=137 Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.200764 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"788c3db9-e44c-4c56-a5d4-392dffa5e21d","Type":"ContainerDied","Data":"ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2"} Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.200796 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.200814 4901 scope.go:117] "RemoveContainer" containerID="ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.200803 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"788c3db9-e44c-4c56-a5d4-392dffa5e21d","Type":"ContainerDied","Data":"ee4c113f4e407543b0df7cf2543b7314cc1d6a562112679ee564306000b1b900"} Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.202787 4901 generic.go:334] "Generic (PLEG): container finished" podID="bfeef473-c12a-4af0-9a27-f1fe52a0b144" containerID="f20698a7666281d2b7b17ad3948cf16cb7044c99d7b7d9bd908fc1bc3eea7a32" exitCode=0 Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.202812 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c2nzk" event={"ID":"bfeef473-c12a-4af0-9a27-f1fe52a0b144","Type":"ContainerDied","Data":"f20698a7666281d2b7b17ad3948cf16cb7044c99d7b7d9bd908fc1bc3eea7a32"} Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.237507 4901 scope.go:117] "RemoveContainer" 
containerID="288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.255312 4901 scope.go:117] "RemoveContainer" containerID="207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.272614 4901 scope.go:117] "RemoveContainer" containerID="f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.301238 4901 scope.go:117] "RemoveContainer" containerID="ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2" Mar 09 03:03:43 crc kubenswrapper[4901]: E0309 03:03:43.301822 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2\": container with ID starting with ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2 not found: ID does not exist" containerID="ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.301873 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2"} err="failed to get container status \"ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2\": rpc error: code = NotFound desc = could not find container \"ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2\": container with ID starting with ff8c4b5cb415a2613e06b88df9a201e75343e9fd9014c0c3d025e9db8a0c2eb2 not found: ID does not exist" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.301899 4901 scope.go:117] "RemoveContainer" containerID="288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291" Mar 09 03:03:43 crc kubenswrapper[4901]: E0309 03:03:43.302170 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291\": container with ID starting with 288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291 not found: ID does not exist" containerID="288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.302203 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291"} err="failed to get container status \"288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291\": rpc error: code = NotFound desc = could not find container \"288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291\": container with ID starting with 288629f2369cf44e8570ccd2b5cb49c13c6b8d3111cf8ba15ad5becdb12a5291 not found: ID does not exist" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.302294 4901 scope.go:117] "RemoveContainer" containerID="207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936" Mar 09 03:03:43 crc kubenswrapper[4901]: E0309 03:03:43.302530 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936\": container with ID starting with 207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936 not found: ID does not exist" containerID="207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.302559 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936"} err="failed to get container status \"207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936\": rpc error: code = NotFound desc = could not find container 
\"207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936\": container with ID starting with 207cdf29b2f56804ac4dc9e8ce38ea02d6c959a6d9a759e151211f7512a0f936 not found: ID does not exist" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.302591 4901 scope.go:117] "RemoveContainer" containerID="f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d" Mar 09 03:03:43 crc kubenswrapper[4901]: E0309 03:03:43.302811 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d\": container with ID starting with f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d not found: ID does not exist" containerID="f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.302844 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d"} err="failed to get container status \"f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d\": rpc error: code = NotFound desc = could not find container \"f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d\": container with ID starting with f5fdd4cdfa035f6889b051c26813eba310fe6f41180e65115f37f28e4142c27d not found: ID does not exist" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.306654 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-combined-ca-bundle\") pod \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.306773 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-sg-core-conf-yaml\") pod \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.306800 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl2jd\" (UniqueName: \"kubernetes.io/projected/788c3db9-e44c-4c56-a5d4-392dffa5e21d-kube-api-access-xl2jd\") pod \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.306833 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-config-data\") pod \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.306931 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-scripts\") pod \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.306964 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-log-httpd\") pod \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.306982 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-run-httpd\") pod \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\" (UID: \"788c3db9-e44c-4c56-a5d4-392dffa5e21d\") " Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.307552 4901 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "788c3db9-e44c-4c56-a5d4-392dffa5e21d" (UID: "788c3db9-e44c-4c56-a5d4-392dffa5e21d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.308096 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "788c3db9-e44c-4c56-a5d4-392dffa5e21d" (UID: "788c3db9-e44c-4c56-a5d4-392dffa5e21d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.312338 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/788c3db9-e44c-4c56-a5d4-392dffa5e21d-kube-api-access-xl2jd" (OuterVolumeSpecName: "kube-api-access-xl2jd") pod "788c3db9-e44c-4c56-a5d4-392dffa5e21d" (UID: "788c3db9-e44c-4c56-a5d4-392dffa5e21d"). InnerVolumeSpecName "kube-api-access-xl2jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.313552 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-scripts" (OuterVolumeSpecName: "scripts") pod "788c3db9-e44c-4c56-a5d4-392dffa5e21d" (UID: "788c3db9-e44c-4c56-a5d4-392dffa5e21d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.368130 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "788c3db9-e44c-4c56-a5d4-392dffa5e21d" (UID: "788c3db9-e44c-4c56-a5d4-392dffa5e21d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.408987 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.409389 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl2jd\" (UniqueName: \"kubernetes.io/projected/788c3db9-e44c-4c56-a5d4-392dffa5e21d-kube-api-access-xl2jd\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.409412 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.409430 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.409445 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/788c3db9-e44c-4c56-a5d4-392dffa5e21d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.418335 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "788c3db9-e44c-4c56-a5d4-392dffa5e21d" (UID: "788c3db9-e44c-4c56-a5d4-392dffa5e21d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.437978 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-config-data" (OuterVolumeSpecName: "config-data") pod "788c3db9-e44c-4c56-a5d4-392dffa5e21d" (UID: "788c3db9-e44c-4c56-a5d4-392dffa5e21d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.511266 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.511302 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788c3db9-e44c-4c56-a5d4-392dffa5e21d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.570238 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.588566 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.601510 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:03:43 crc kubenswrapper[4901]: E0309 03:03:43.601920 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="proxy-httpd" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.601938 4901 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="proxy-httpd" Mar 09 03:03:43 crc kubenswrapper[4901]: E0309 03:03:43.601950 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="ceilometer-notification-agent" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.601956 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="ceilometer-notification-agent" Mar 09 03:03:43 crc kubenswrapper[4901]: E0309 03:03:43.601968 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50fd5778-3018-4a41-8db7-285ca63540a5" containerName="placement-api" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.601974 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="50fd5778-3018-4a41-8db7-285ca63540a5" containerName="placement-api" Mar 09 03:03:43 crc kubenswrapper[4901]: E0309 03:03:43.601982 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="sg-core" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.601989 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="sg-core" Mar 09 03:03:43 crc kubenswrapper[4901]: E0309 03:03:43.602016 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50fd5778-3018-4a41-8db7-285ca63540a5" containerName="placement-log" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.602021 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="50fd5778-3018-4a41-8db7-285ca63540a5" containerName="placement-log" Mar 09 03:03:43 crc kubenswrapper[4901]: E0309 03:03:43.602028 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="ceilometer-central-agent" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.602033 4901 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="ceilometer-central-agent" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.602178 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="proxy-httpd" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.602189 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="ceilometer-notification-agent" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.602200 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="50fd5778-3018-4a41-8db7-285ca63540a5" containerName="placement-api" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.602206 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="ceilometer-central-agent" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.602218 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="50fd5778-3018-4a41-8db7-285ca63540a5" containerName="placement-log" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.602242 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" containerName="sg-core" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.603747 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.607922 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.608644 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.611394 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.714197 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-run-httpd\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.714270 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2rhj\" (UniqueName: \"kubernetes.io/projected/82b374be-f2cd-4656-87de-434995c335b8-kube-api-access-n2rhj\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.714295 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-log-httpd\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.714333 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.714607 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-config-data\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.714943 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-scripts\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.715135 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.817022 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.819158 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-run-httpd\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.819973 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n2rhj\" (UniqueName: \"kubernetes.io/projected/82b374be-f2cd-4656-87de-434995c335b8-kube-api-access-n2rhj\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.820037 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-log-httpd\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.820105 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.820180 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-config-data\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.819894 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-run-httpd\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.820990 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-log-httpd\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " 
pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.821768 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-scripts\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.825499 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.825836 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-config-data\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.828514 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-scripts\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.832247 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.839779 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2rhj\" (UniqueName: 
\"kubernetes.io/projected/82b374be-f2cd-4656-87de-434995c335b8-kube-api-access-n2rhj\") pod \"ceilometer-0\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " pod="openstack/ceilometer-0" Mar 09 03:03:43 crc kubenswrapper[4901]: I0309 03:03:43.931330 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.128214 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="788c3db9-e44c-4c56-a5d4-392dffa5e21d" path="/var/lib/kubelet/pods/788c3db9-e44c-4c56-a5d4-392dffa5e21d/volumes" Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.396441 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.443002 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.539026 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-config-data\") pod \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.539141 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl8vz\" (UniqueName: \"kubernetes.io/projected/bfeef473-c12a-4af0-9a27-f1fe52a0b144-kube-api-access-dl8vz\") pod \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.539203 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-scripts\") pod \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " 
Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.539281 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-combined-ca-bundle\") pod \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\" (UID: \"bfeef473-c12a-4af0-9a27-f1fe52a0b144\") " Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.544841 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfeef473-c12a-4af0-9a27-f1fe52a0b144-kube-api-access-dl8vz" (OuterVolumeSpecName: "kube-api-access-dl8vz") pod "bfeef473-c12a-4af0-9a27-f1fe52a0b144" (UID: "bfeef473-c12a-4af0-9a27-f1fe52a0b144"). InnerVolumeSpecName "kube-api-access-dl8vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.545404 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-scripts" (OuterVolumeSpecName: "scripts") pod "bfeef473-c12a-4af0-9a27-f1fe52a0b144" (UID: "bfeef473-c12a-4af0-9a27-f1fe52a0b144"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.565344 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-config-data" (OuterVolumeSpecName: "config-data") pod "bfeef473-c12a-4af0-9a27-f1fe52a0b144" (UID: "bfeef473-c12a-4af0-9a27-f1fe52a0b144"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.567311 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfeef473-c12a-4af0-9a27-f1fe52a0b144" (UID: "bfeef473-c12a-4af0-9a27-f1fe52a0b144"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.641686 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.641715 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.641724 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfeef473-c12a-4af0-9a27-f1fe52a0b144-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:44 crc kubenswrapper[4901]: I0309 03:03:44.641743 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl8vz\" (UniqueName: \"kubernetes.io/projected/bfeef473-c12a-4af0-9a27-f1fe52a0b144-kube-api-access-dl8vz\") on node \"crc\" DevicePath \"\"" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.221312 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b374be-f2cd-4656-87de-434995c335b8","Type":"ContainerStarted","Data":"d52cb09ac30319a2c4deb33c5e55021fc2ff481395bdb4392a711410f679a1a7"} Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.224020 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-c2nzk" event={"ID":"bfeef473-c12a-4af0-9a27-f1fe52a0b144","Type":"ContainerDied","Data":"fb2766afb4515aeba13db4f1e390af23926fd0767969f9d8a0a8a96e91fa1421"} Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.224047 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb2766afb4515aeba13db4f1e390af23926fd0767969f9d8a0a8a96e91fa1421" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.224102 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c2nzk" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.316084 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 03:03:45 crc kubenswrapper[4901]: E0309 03:03:45.316535 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfeef473-c12a-4af0-9a27-f1fe52a0b144" containerName="nova-cell0-conductor-db-sync" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.316556 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfeef473-c12a-4af0-9a27-f1fe52a0b144" containerName="nova-cell0-conductor-db-sync" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.316799 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfeef473-c12a-4af0-9a27-f1fe52a0b144" containerName="nova-cell0-conductor-db-sync" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.317482 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.320015 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-sm9q2" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.320211 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.337809 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.464759 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62mrb\" (UniqueName: \"kubernetes.io/projected/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-kube-api-access-62mrb\") pod \"nova-cell0-conductor-0\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.464913 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.465145 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.567245 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62mrb\" (UniqueName: 
\"kubernetes.io/projected/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-kube-api-access-62mrb\") pod \"nova-cell0-conductor-0\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.567309 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.567399 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.571730 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.574538 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.584877 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62mrb\" (UniqueName: \"kubernetes.io/projected/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-kube-api-access-62mrb\") pod \"nova-cell0-conductor-0\" (UID: 
\"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:45 crc kubenswrapper[4901]: I0309 03:03:45.638040 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:46 crc kubenswrapper[4901]: I0309 03:03:46.129865 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 03:03:46 crc kubenswrapper[4901]: I0309 03:03:46.237663 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b374be-f2cd-4656-87de-434995c335b8","Type":"ContainerStarted","Data":"194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f"} Mar 09 03:03:46 crc kubenswrapper[4901]: I0309 03:03:46.237727 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b374be-f2cd-4656-87de-434995c335b8","Type":"ContainerStarted","Data":"be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707"} Mar 09 03:03:46 crc kubenswrapper[4901]: I0309 03:03:46.238757 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f","Type":"ContainerStarted","Data":"6b9e33445c19373191e94b111d4c2b86f6d395ccfd04820e7d1d5abe3743aa0e"} Mar 09 03:03:47 crc kubenswrapper[4901]: I0309 03:03:47.249991 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f","Type":"ContainerStarted","Data":"2a8500133ddbae16882734d17dcfeac24a437220c873e5c49b9335461b23a2a0"} Mar 09 03:03:47 crc kubenswrapper[4901]: I0309 03:03:47.250437 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:47 crc kubenswrapper[4901]: I0309 03:03:47.272280 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" 
podStartSLOduration=2.272260848 podStartE2EDuration="2.272260848s" podCreationTimestamp="2026-03-09 03:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:03:47.267824556 +0000 UTC m=+1351.857488298" watchObservedRunningTime="2026-03-09 03:03:47.272260848 +0000 UTC m=+1351.861924590" Mar 09 03:03:48 crc kubenswrapper[4901]: I0309 03:03:48.262969 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b374be-f2cd-4656-87de-434995c335b8","Type":"ContainerStarted","Data":"327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213"} Mar 09 03:03:50 crc kubenswrapper[4901]: I0309 03:03:50.294862 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b374be-f2cd-4656-87de-434995c335b8","Type":"ContainerStarted","Data":"490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783"} Mar 09 03:03:50 crc kubenswrapper[4901]: I0309 03:03:50.295795 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 03:03:50 crc kubenswrapper[4901]: I0309 03:03:50.327124 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.91213613 podStartE2EDuration="7.327106288s" podCreationTimestamp="2026-03-09 03:03:43 +0000 UTC" firstStartedPulling="2026-03-09 03:03:44.416722496 +0000 UTC m=+1349.006386238" lastFinishedPulling="2026-03-09 03:03:49.831692654 +0000 UTC m=+1354.421356396" observedRunningTime="2026-03-09 03:03:50.320962443 +0000 UTC m=+1354.910626185" watchObservedRunningTime="2026-03-09 03:03:50.327106288 +0000 UTC m=+1354.916770020" Mar 09 03:03:55 crc kubenswrapper[4901]: I0309 03:03:55.688069 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.266371 
4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lb2xt"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.267507 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.269954 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.270133 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.279039 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lb2xt"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.435915 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.436042 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czk72\" (UniqueName: \"kubernetes.io/projected/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-kube-api-access-czk72\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.436069 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-config-data\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " 
pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.436132 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-scripts\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.472486 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.473905 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.476452 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.486409 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.487667 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.501591 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.504814 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.535408 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.538649 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.538730 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.538771 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jz7c\" (UniqueName: \"kubernetes.io/projected/b7764b86-48f0-4058-a60d-9275796fe58c-kube-api-access-6jz7c\") pod \"nova-scheduler-0\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.538815 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-config-data\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: 
\"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.538833 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czk72\" (UniqueName: \"kubernetes.io/projected/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-kube-api-access-czk72\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.538890 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-config-data\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.538910 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c442232-3859-43a4-9f6a-bc330c647b14-logs\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.538925 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.538949 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-scripts\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc 
kubenswrapper[4901]: I0309 03:03:56.538987 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96tc6\" (UniqueName: \"kubernetes.io/projected/3c442232-3859-43a4-9f6a-bc330c647b14-kube-api-access-96tc6\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.539002 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-config-data\") pod \"nova-scheduler-0\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.544845 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-scripts\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.557359 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-config-data\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.557808 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.588211 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-czk72\" (UniqueName: \"kubernetes.io/projected/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-kube-api-access-czk72\") pod \"nova-cell0-cell-mapping-lb2xt\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.613280 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.615080 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.621998 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.629612 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.637379 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.638965 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640420 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jz7c\" (UniqueName: \"kubernetes.io/projected/b7764b86-48f0-4058-a60d-9275796fe58c-kube-api-access-6jz7c\") pod \"nova-scheduler-0\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640462 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640489 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-logs\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640509 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnz4p\" (UniqueName: \"kubernetes.io/projected/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-kube-api-access-bnz4p\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640527 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc 
kubenswrapper[4901]: I0309 03:03:56.640552 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-config-data\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640571 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c442232-3859-43a4-9f6a-bc330c647b14-logs\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640586 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640606 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-config-data\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640622 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnpwr\" (UniqueName: \"kubernetes.io/projected/8155ec57-49a6-4d30-930e-9f10e3f28d17-kube-api-access-nnpwr\") pod \"nova-cell1-novncproxy-0\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640660 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96tc6\" (UniqueName: 
\"kubernetes.io/projected/3c442232-3859-43a4-9f6a-bc330c647b14-kube-api-access-96tc6\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640676 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-config-data\") pod \"nova-scheduler-0\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640693 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.640745 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.641762 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c442232-3859-43a4-9f6a-bc330c647b14-logs\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.659899 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-config-data\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc 
kubenswrapper[4901]: I0309 03:03:56.660630 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-config-data\") pod \"nova-scheduler-0\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.661327 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.664919 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.673533 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96tc6\" (UniqueName: \"kubernetes.io/projected/3c442232-3859-43a4-9f6a-bc330c647b14-kube-api-access-96tc6\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.678866 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jz7c\" (UniqueName: \"kubernetes.io/projected/b7764b86-48f0-4058-a60d-9275796fe58c-kube-api-access-6jz7c\") pod \"nova-scheduler-0\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.685420 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 
03:03:56.690791 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.742128 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.744997 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-config-data\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.745037 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnpwr\" (UniqueName: \"kubernetes.io/projected/8155ec57-49a6-4d30-930e-9f10e3f28d17-kube-api-access-nnpwr\") pod \"nova-cell1-novncproxy-0\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.745140 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.745448 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc 
kubenswrapper[4901]: I0309 03:03:56.745499 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-logs\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.745527 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnz4p\" (UniqueName: \"kubernetes.io/projected/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-kube-api-access-bnz4p\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.746814 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-logs\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.750427 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.750497 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-config-data\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.750585 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-combined-ca-bundle\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.754720 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.762625 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7559df67df-7q7jp"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.764148 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.770314 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnpwr\" (UniqueName: \"kubernetes.io/projected/8155ec57-49a6-4d30-930e-9f10e3f28d17-kube-api-access-nnpwr\") pod \"nova-cell1-novncproxy-0\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.775306 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7559df67df-7q7jp"] Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.778937 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnz4p\" (UniqueName: \"kubernetes.io/projected/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-kube-api-access-bnz4p\") pod \"nova-metadata-0\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " pod="openstack/nova-metadata-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.794973 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.806866 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.846354 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-sb\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.846391 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-config\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.846415 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-svc\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.846437 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6l2g\" (UniqueName: \"kubernetes.io/projected/f0c1248f-7953-4a19-a4ff-f7b717bd699b-kube-api-access-h6l2g\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.846621 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-nb\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.846646 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-swift-storage-0\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.884920 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.947824 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6l2g\" (UniqueName: \"kubernetes.io/projected/f0c1248f-7953-4a19-a4ff-f7b717bd699b-kube-api-access-h6l2g\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.947974 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-nb\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.948005 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-swift-storage-0\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") 
" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.948036 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-sb\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.948057 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-config\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.948075 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-svc\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.949046 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-svc\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.949355 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-config\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.949484 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-swift-storage-0\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.949663 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-nb\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.951740 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-sb\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.964962 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:03:56 crc kubenswrapper[4901]: I0309 03:03:56.989444 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6l2g\" (UniqueName: \"kubernetes.io/projected/f0c1248f-7953-4a19-a4ff-f7b717bd699b-kube-api-access-h6l2g\") pod \"dnsmasq-dns-7559df67df-7q7jp\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.040328 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.107775 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.297527 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jv9hz"] Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.299075 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.301979 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.302145 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.307316 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jv9hz"] Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.355381 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-config-data\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.355442 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.355498 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-scripts\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.355594 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwjxk\" (UniqueName: \"kubernetes.io/projected/03fb66e1-3428-40d7-a0b4-b8a7938ff800-kube-api-access-cwjxk\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: W0309 03:03:57.421423 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c442232_3859_43a4_9f6a_bc330c647b14.slice/crio-071d4808f8beab3ebc139bcb783c4e85a52870c24a0059bc855d8d99700b21bc WatchSource:0}: Error finding container 071d4808f8beab3ebc139bcb783c4e85a52870c24a0059bc855d8d99700b21bc: Status 404 returned error can't find the container with id 071d4808f8beab3ebc139bcb783c4e85a52870c24a0059bc855d8d99700b21bc Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.425930 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.440756 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lb2xt"] Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.457982 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwjxk\" (UniqueName: \"kubernetes.io/projected/03fb66e1-3428-40d7-a0b4-b8a7938ff800-kube-api-access-cwjxk\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.458071 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-config-data\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.458116 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.458170 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-scripts\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.461124 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.466102 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-scripts\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.466332 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-config-data\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " 
pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.466658 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.473679 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwjxk\" (UniqueName: \"kubernetes.io/projected/03fb66e1-3428-40d7-a0b4-b8a7938ff800-kube-api-access-cwjxk\") pod \"nova-cell1-conductor-db-sync-jv9hz\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.618630 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.710839 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.730266 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:03:57 crc kubenswrapper[4901]: I0309 03:03:57.737286 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7559df67df-7q7jp"] Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.122940 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jv9hz"] Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.379141 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b","Type":"ContainerStarted","Data":"25fc2da38d0a39e81857e99db8997a6bf9e5564c26a20611cc29a403b4e3b0bb"} Mar 09 
03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.380755 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8155ec57-49a6-4d30-930e-9f10e3f28d17","Type":"ContainerStarted","Data":"6353c34d78ff3563cd0844b477ac08d81a1bc39a1bf3831ad829e47004f81e4c"} Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.382671 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jv9hz" event={"ID":"03fb66e1-3428-40d7-a0b4-b8a7938ff800","Type":"ContainerStarted","Data":"5d9e5ccec85a35a04d0a76fa028d5ad72a893c23a40e4f14f4d92ce0a8b5962c"} Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.382698 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jv9hz" event={"ID":"03fb66e1-3428-40d7-a0b4-b8a7938ff800","Type":"ContainerStarted","Data":"6daa5b108c01deec5d19f2ca4647b81cf00879c7dcb1fd8a91f3c95c06e3c69c"} Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.390234 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lb2xt" event={"ID":"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5","Type":"ContainerStarted","Data":"7eb050b8062d25a6a16cf0cb4f42df92e684eebb712dff6d0628185b0a853346"} Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.390275 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lb2xt" event={"ID":"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5","Type":"ContainerStarted","Data":"4a44912171118290a3155a3ed5211cf1797c82623d255dc66cc3f1bcb28ef1c6"} Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.392160 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c442232-3859-43a4-9f6a-bc330c647b14","Type":"ContainerStarted","Data":"071d4808f8beab3ebc139bcb783c4e85a52870c24a0059bc855d8d99700b21bc"} Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.393806 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"b7764b86-48f0-4058-a60d-9275796fe58c","Type":"ContainerStarted","Data":"88a9ee72d697005f633fa182795e673a83604d8f8f3ff10163899fc0259d33ad"} Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.396489 4901 generic.go:334] "Generic (PLEG): container finished" podID="f0c1248f-7953-4a19-a4ff-f7b717bd699b" containerID="07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff" exitCode=0 Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.396532 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" event={"ID":"f0c1248f-7953-4a19-a4ff-f7b717bd699b","Type":"ContainerDied","Data":"07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff"} Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.396555 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" event={"ID":"f0c1248f-7953-4a19-a4ff-f7b717bd699b","Type":"ContainerStarted","Data":"8bc7e7bcfa9ea0b7f46346767b5526b50417a3194a64db5883b1b935e93b9a82"} Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.405890 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-jv9hz" podStartSLOduration=1.405871562 podStartE2EDuration="1.405871562s" podCreationTimestamp="2026-03-09 03:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:03:58.398142677 +0000 UTC m=+1362.987806409" watchObservedRunningTime="2026-03-09 03:03:58.405871562 +0000 UTC m=+1362.995535294" Mar 09 03:03:58 crc kubenswrapper[4901]: I0309 03:03:58.422693 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lb2xt" podStartSLOduration=2.422675006 podStartE2EDuration="2.422675006s" podCreationTimestamp="2026-03-09 03:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:03:58.411701209 +0000 UTC m=+1363.001364951" watchObservedRunningTime="2026-03-09 03:03:58.422675006 +0000 UTC m=+1363.012338738" Mar 09 03:03:59 crc kubenswrapper[4901]: I0309 03:03:59.416127 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" event={"ID":"f0c1248f-7953-4a19-a4ff-f7b717bd699b","Type":"ContainerStarted","Data":"da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a"} Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.132481 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" podStartSLOduration=4.132457031 podStartE2EDuration="4.132457031s" podCreationTimestamp="2026-03-09 03:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:03:59.440689183 +0000 UTC m=+1364.030352935" watchObservedRunningTime="2026-03-09 03:04:00.132457031 +0000 UTC m=+1364.722120753" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.158588 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550424-rt4jl"] Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.160624 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550424-rt4jl" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.169054 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.170709 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.170660 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.174983 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.209733 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550424-rt4jl"] Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.235902 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.334978 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjp9j\" (UniqueName: \"kubernetes.io/projected/725ee9e1-68e0-4756-bfc7-e4d209aeeae8-kube-api-access-kjp9j\") pod \"auto-csr-approver-29550424-rt4jl\" (UID: \"725ee9e1-68e0-4756-bfc7-e4d209aeeae8\") " pod="openshift-infra/auto-csr-approver-29550424-rt4jl" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.428905 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.436035 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjp9j\" (UniqueName: \"kubernetes.io/projected/725ee9e1-68e0-4756-bfc7-e4d209aeeae8-kube-api-access-kjp9j\") pod 
\"auto-csr-approver-29550424-rt4jl\" (UID: \"725ee9e1-68e0-4756-bfc7-e4d209aeeae8\") " pod="openshift-infra/auto-csr-approver-29550424-rt4jl" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.455991 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjp9j\" (UniqueName: \"kubernetes.io/projected/725ee9e1-68e0-4756-bfc7-e4d209aeeae8-kube-api-access-kjp9j\") pod \"auto-csr-approver-29550424-rt4jl\" (UID: \"725ee9e1-68e0-4756-bfc7-e4d209aeeae8\") " pod="openshift-infra/auto-csr-approver-29550424-rt4jl" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.491618 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550424-rt4jl" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.863003 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.863470 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.863512 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.864156 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9277963b36f2cb3e2457299d51b88e2cddec56d32cd2a3c3337a07a6a046785b"} 
pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 03:04:00 crc kubenswrapper[4901]: I0309 03:04:00.864203 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://9277963b36f2cb3e2457299d51b88e2cddec56d32cd2a3c3337a07a6a046785b" gracePeriod=600 Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.142662 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550424-rt4jl"] Mar 09 03:04:01 crc kubenswrapper[4901]: W0309 03:04:01.170470 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod725ee9e1_68e0_4756_bfc7_e4d209aeeae8.slice/crio-c3e53d574da66bdd6744a0c9c4feb41e6bd92ab1bab367b5597219630e6455c6 WatchSource:0}: Error finding container c3e53d574da66bdd6744a0c9c4feb41e6bd92ab1bab367b5597219630e6455c6: Status 404 returned error can't find the container with id c3e53d574da66bdd6744a0c9c4feb41e6bd92ab1bab367b5597219630e6455c6 Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.440293 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8155ec57-49a6-4d30-930e-9f10e3f28d17","Type":"ContainerStarted","Data":"db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c"} Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.440338 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8155ec57-49a6-4d30-930e-9f10e3f28d17" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c" gracePeriod=30 Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 
03:04:01.441896 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550424-rt4jl" event={"ID":"725ee9e1-68e0-4756-bfc7-e4d209aeeae8","Type":"ContainerStarted","Data":"c3e53d574da66bdd6744a0c9c4feb41e6bd92ab1bab367b5597219630e6455c6"} Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.443284 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7764b86-48f0-4058-a60d-9275796fe58c","Type":"ContainerStarted","Data":"5ce1250df11ac77db18e280bfbff43f51b13a9f3620235afe3114b645f1a1693"} Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.446610 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c442232-3859-43a4-9f6a-bc330c647b14","Type":"ContainerStarted","Data":"2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971"} Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.446642 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c442232-3859-43a4-9f6a-bc330c647b14","Type":"ContainerStarted","Data":"134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849"} Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.456838 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b","Type":"ContainerStarted","Data":"4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763"} Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.456881 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b","Type":"ContainerStarted","Data":"4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835"} Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.456905 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" 
containerName="nova-metadata-log" containerID="cri-o://4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835" gracePeriod=30 Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.456948 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" containerName="nova-metadata-metadata" containerID="cri-o://4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763" gracePeriod=30 Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.466131 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.507213819 podStartE2EDuration="5.466114598s" podCreationTimestamp="2026-03-09 03:03:56 +0000 UTC" firstStartedPulling="2026-03-09 03:03:57.749025715 +0000 UTC m=+1362.338689447" lastFinishedPulling="2026-03-09 03:04:00.707926494 +0000 UTC m=+1365.297590226" observedRunningTime="2026-03-09 03:04:01.457563372 +0000 UTC m=+1366.047227114" watchObservedRunningTime="2026-03-09 03:04:01.466114598 +0000 UTC m=+1366.055778320" Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.470913 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="9277963b36f2cb3e2457299d51b88e2cddec56d32cd2a3c3337a07a6a046785b" exitCode=0 Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.471161 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"9277963b36f2cb3e2457299d51b88e2cddec56d32cd2a3c3337a07a6a046785b"} Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.471196 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"e3004d260bc17a7df9a3f09f9c3fb88b56d94af0a91dbe7f057c714451b1f515"} Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.471231 4901 scope.go:117] "RemoveContainer" containerID="de1ddcd67e5e6d7dcbea8ef2824b5106d2b931526ee8f8e98968dbc1152811b6" Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.514474 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.227267787 podStartE2EDuration="5.514453107s" podCreationTimestamp="2026-03-09 03:03:56 +0000 UTC" firstStartedPulling="2026-03-09 03:03:57.432734787 +0000 UTC m=+1362.022398519" lastFinishedPulling="2026-03-09 03:04:00.719920107 +0000 UTC m=+1365.309583839" observedRunningTime="2026-03-09 03:04:01.504591678 +0000 UTC m=+1366.094255420" watchObservedRunningTime="2026-03-09 03:04:01.514453107 +0000 UTC m=+1366.104116839" Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.517288 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.572099945 podStartE2EDuration="5.517275838s" podCreationTimestamp="2026-03-09 03:03:56 +0000 UTC" firstStartedPulling="2026-03-09 03:03:57.765001778 +0000 UTC m=+1362.354665510" lastFinishedPulling="2026-03-09 03:04:00.710177671 +0000 UTC m=+1365.299841403" observedRunningTime="2026-03-09 03:04:01.489658942 +0000 UTC m=+1366.079322674" watchObservedRunningTime="2026-03-09 03:04:01.517275838 +0000 UTC m=+1366.106939570" Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.523615 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.270689233 podStartE2EDuration="5.523600158s" podCreationTimestamp="2026-03-09 03:03:56 +0000 UTC" firstStartedPulling="2026-03-09 03:03:57.45504434 +0000 UTC m=+1362.044708072" lastFinishedPulling="2026-03-09 03:04:00.707955265 +0000 UTC m=+1365.297618997" 
observedRunningTime="2026-03-09 03:04:01.52328246 +0000 UTC m=+1366.112946192" watchObservedRunningTime="2026-03-09 03:04:01.523600158 +0000 UTC m=+1366.113263890" Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.807768 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 03:04:01 crc kubenswrapper[4901]: I0309 03:04:01.965347 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.041405 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.041493 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.188615 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.376018 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnz4p\" (UniqueName: \"kubernetes.io/projected/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-kube-api-access-bnz4p\") pod \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.377311 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-config-data\") pod \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.377441 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-logs\") pod 
\"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.377525 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-combined-ca-bundle\") pod \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\" (UID: \"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b\") " Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.378132 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-logs" (OuterVolumeSpecName: "logs") pod "2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" (UID: "2ff45df6-72fc-4db9-abe9-c5fa3bdc483b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.378628 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.384905 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-kube-api-access-bnz4p" (OuterVolumeSpecName: "kube-api-access-bnz4p") pod "2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" (UID: "2ff45df6-72fc-4db9-abe9-c5fa3bdc483b"). InnerVolumeSpecName "kube-api-access-bnz4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.415706 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" (UID: "2ff45df6-72fc-4db9-abe9-c5fa3bdc483b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.424558 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-config-data" (OuterVolumeSpecName: "config-data") pod "2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" (UID: "2ff45df6-72fc-4db9-abe9-c5fa3bdc483b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.480335 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.480365 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.480376 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnz4p\" (UniqueName: \"kubernetes.io/projected/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b-kube-api-access-bnz4p\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.505327 4901 generic.go:334] "Generic (PLEG): container finished" podID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" containerID="4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763" exitCode=0 Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.505407 4901 generic.go:334] "Generic (PLEG): container finished" podID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" containerID="4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835" exitCode=143 Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.505484 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b","Type":"ContainerDied","Data":"4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763"} Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.505510 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b","Type":"ContainerDied","Data":"4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835"} Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.505529 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ff45df6-72fc-4db9-abe9-c5fa3bdc483b","Type":"ContainerDied","Data":"25fc2da38d0a39e81857e99db8997a6bf9e5564c26a20611cc29a403b4e3b0bb"} Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.505547 4901 scope.go:117] "RemoveContainer" containerID="4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.505728 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.551763 4901 scope.go:117] "RemoveContainer" containerID="4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.573752 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.581756 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.597113 4901 scope.go:117] "RemoveContainer" containerID="4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763" Mar 09 03:04:02 crc kubenswrapper[4901]: E0309 03:04:02.597602 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763\": container with ID starting with 4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763 not found: ID does not exist" containerID="4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.597635 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763"} err="failed to get container status \"4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763\": rpc error: code = NotFound desc = could not find container \"4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763\": container with ID starting with 4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763 not found: ID does not exist" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.597661 4901 scope.go:117] "RemoveContainer" containerID="4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835" Mar 09 03:04:02 crc kubenswrapper[4901]: 
E0309 03:04:02.598029 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835\": container with ID starting with 4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835 not found: ID does not exist" containerID="4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.598051 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835"} err="failed to get container status \"4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835\": rpc error: code = NotFound desc = could not find container \"4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835\": container with ID starting with 4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835 not found: ID does not exist" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.598070 4901 scope.go:117] "RemoveContainer" containerID="4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.598574 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763"} err="failed to get container status \"4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763\": rpc error: code = NotFound desc = could not find container \"4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763\": container with ID starting with 4026a0fbb7c5d2e1fa115a5eda3236812c99eb3cb536c8c7666bc02bc9e1b763 not found: ID does not exist" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.598599 4901 scope.go:117] "RemoveContainer" containerID="4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835" Mar 09 03:04:02 crc 
kubenswrapper[4901]: I0309 03:04:02.598951 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835"} err="failed to get container status \"4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835\": rpc error: code = NotFound desc = could not find container \"4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835\": container with ID starting with 4ebad9aa07970bd6cbd4b9801c88cd0a8767fee7261bccac3eda8513e2234835 not found: ID does not exist" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.601003 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:02 crc kubenswrapper[4901]: E0309 03:04:02.601584 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" containerName="nova-metadata-log" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.601600 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" containerName="nova-metadata-log" Mar 09 03:04:02 crc kubenswrapper[4901]: E0309 03:04:02.601618 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" containerName="nova-metadata-metadata" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.601624 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" containerName="nova-metadata-metadata" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.601800 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" containerName="nova-metadata-metadata" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.601821 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" containerName="nova-metadata-log" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 
03:04:02.602737 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.604825 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.605034 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.621266 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.683996 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtsn4\" (UniqueName: \"kubernetes.io/projected/3b885da9-9534-4b38-af0f-6e88cd7c068b-kube-api-access-qtsn4\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.685256 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-config-data\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.685601 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.685641 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3b885da9-9534-4b38-af0f-6e88cd7c068b-logs\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.685666 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.786863 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtsn4\" (UniqueName: \"kubernetes.io/projected/3b885da9-9534-4b38-af0f-6e88cd7c068b-kube-api-access-qtsn4\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.787191 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-config-data\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.787284 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.787308 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b885da9-9534-4b38-af0f-6e88cd7c068b-logs\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " 
pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.787330 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.792028 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b885da9-9534-4b38-af0f-6e88cd7c068b-logs\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.792131 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.793069 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.793820 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-config-data\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.808058 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtsn4\" (UniqueName: 
\"kubernetes.io/projected/3b885da9-9534-4b38-af0f-6e88cd7c068b-kube-api-access-qtsn4\") pod \"nova-metadata-0\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " pod="openstack/nova-metadata-0" Mar 09 03:04:02 crc kubenswrapper[4901]: I0309 03:04:02.922723 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:03 crc kubenswrapper[4901]: W0309 03:04:03.418749 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b885da9_9534_4b38_af0f_6e88cd7c068b.slice/crio-58beab63d9f2c8e0c8541c0db19609f5e49be4c98b2cb81c5a934e63edd4e4b7 WatchSource:0}: Error finding container 58beab63d9f2c8e0c8541c0db19609f5e49be4c98b2cb81c5a934e63edd4e4b7: Status 404 returned error can't find the container with id 58beab63d9f2c8e0c8541c0db19609f5e49be4c98b2cb81c5a934e63edd4e4b7 Mar 09 03:04:03 crc kubenswrapper[4901]: I0309 03:04:03.420772 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:03 crc kubenswrapper[4901]: I0309 03:04:03.537193 4901 generic.go:334] "Generic (PLEG): container finished" podID="725ee9e1-68e0-4756-bfc7-e4d209aeeae8" containerID="7323f3ec790c22a5c15492d0b747ce5e981ed6529f8b38178d47eb18686cb80a" exitCode=0 Mar 09 03:04:03 crc kubenswrapper[4901]: I0309 03:04:03.537566 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550424-rt4jl" event={"ID":"725ee9e1-68e0-4756-bfc7-e4d209aeeae8","Type":"ContainerDied","Data":"7323f3ec790c22a5c15492d0b747ce5e981ed6529f8b38178d47eb18686cb80a"} Mar 09 03:04:03 crc kubenswrapper[4901]: I0309 03:04:03.548867 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b885da9-9534-4b38-af0f-6e88cd7c068b","Type":"ContainerStarted","Data":"58beab63d9f2c8e0c8541c0db19609f5e49be4c98b2cb81c5a934e63edd4e4b7"} Mar 09 03:04:04 crc kubenswrapper[4901]: I0309 03:04:04.119403 
4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff45df6-72fc-4db9-abe9-c5fa3bdc483b" path="/var/lib/kubelet/pods/2ff45df6-72fc-4db9-abe9-c5fa3bdc483b/volumes" Mar 09 03:04:04 crc kubenswrapper[4901]: I0309 03:04:04.565611 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b885da9-9534-4b38-af0f-6e88cd7c068b","Type":"ContainerStarted","Data":"a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177"} Mar 09 03:04:04 crc kubenswrapper[4901]: I0309 03:04:04.565712 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b885da9-9534-4b38-af0f-6e88cd7c068b","Type":"ContainerStarted","Data":"69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84"} Mar 09 03:04:04 crc kubenswrapper[4901]: I0309 03:04:04.601433 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.601404836 podStartE2EDuration="2.601404836s" podCreationTimestamp="2026-03-09 03:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:04.585760282 +0000 UTC m=+1369.175424054" watchObservedRunningTime="2026-03-09 03:04:04.601404836 +0000 UTC m=+1369.191068608" Mar 09 03:04:04 crc kubenswrapper[4901]: I0309 03:04:04.976487 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550424-rt4jl" Mar 09 03:04:05 crc kubenswrapper[4901]: I0309 03:04:05.143641 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjp9j\" (UniqueName: \"kubernetes.io/projected/725ee9e1-68e0-4756-bfc7-e4d209aeeae8-kube-api-access-kjp9j\") pod \"725ee9e1-68e0-4756-bfc7-e4d209aeeae8\" (UID: \"725ee9e1-68e0-4756-bfc7-e4d209aeeae8\") " Mar 09 03:04:05 crc kubenswrapper[4901]: I0309 03:04:05.150311 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725ee9e1-68e0-4756-bfc7-e4d209aeeae8-kube-api-access-kjp9j" (OuterVolumeSpecName: "kube-api-access-kjp9j") pod "725ee9e1-68e0-4756-bfc7-e4d209aeeae8" (UID: "725ee9e1-68e0-4756-bfc7-e4d209aeeae8"). InnerVolumeSpecName "kube-api-access-kjp9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:05 crc kubenswrapper[4901]: I0309 03:04:05.246606 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjp9j\" (UniqueName: \"kubernetes.io/projected/725ee9e1-68e0-4756-bfc7-e4d209aeeae8-kube-api-access-kjp9j\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:05 crc kubenswrapper[4901]: I0309 03:04:05.580734 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550424-rt4jl" Mar 09 03:04:05 crc kubenswrapper[4901]: I0309 03:04:05.580808 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550424-rt4jl" event={"ID":"725ee9e1-68e0-4756-bfc7-e4d209aeeae8","Type":"ContainerDied","Data":"c3e53d574da66bdd6744a0c9c4feb41e6bd92ab1bab367b5597219630e6455c6"} Mar 09 03:04:05 crc kubenswrapper[4901]: I0309 03:04:05.581506 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3e53d574da66bdd6744a0c9c4feb41e6bd92ab1bab367b5597219630e6455c6" Mar 09 03:04:05 crc kubenswrapper[4901]: I0309 03:04:05.583838 4901 generic.go:334] "Generic (PLEG): container finished" podID="03fb66e1-3428-40d7-a0b4-b8a7938ff800" containerID="5d9e5ccec85a35a04d0a76fa028d5ad72a893c23a40e4f14f4d92ce0a8b5962c" exitCode=0 Mar 09 03:04:05 crc kubenswrapper[4901]: I0309 03:04:05.583930 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jv9hz" event={"ID":"03fb66e1-3428-40d7-a0b4-b8a7938ff800","Type":"ContainerDied","Data":"5d9e5ccec85a35a04d0a76fa028d5ad72a893c23a40e4f14f4d92ce0a8b5962c"} Mar 09 03:04:05 crc kubenswrapper[4901]: I0309 03:04:05.587091 4901 generic.go:334] "Generic (PLEG): container finished" podID="aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5" containerID="7eb050b8062d25a6a16cf0cb4f42df92e684eebb712dff6d0628185b0a853346" exitCode=0 Mar 09 03:04:05 crc kubenswrapper[4901]: I0309 03:04:05.588236 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lb2xt" event={"ID":"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5","Type":"ContainerDied","Data":"7eb050b8062d25a6a16cf0cb4f42df92e684eebb712dff6d0628185b0a853346"} Mar 09 03:04:06 crc kubenswrapper[4901]: I0309 03:04:06.069663 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550418-rmqtb"] Mar 09 03:04:06 crc kubenswrapper[4901]: I0309 03:04:06.082042 
4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550418-rmqtb"] Mar 09 03:04:06 crc kubenswrapper[4901]: I0309 03:04:06.126064 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eb148ab-7da6-4a35-9fab-16e7a98612a6" path="/var/lib/kubelet/pods/9eb148ab-7da6-4a35-9fab-16e7a98612a6/volumes" Mar 09 03:04:06 crc kubenswrapper[4901]: I0309 03:04:06.795139 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 03:04:06 crc kubenswrapper[4901]: I0309 03:04:06.795505 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 03:04:06 crc kubenswrapper[4901]: I0309 03:04:06.809472 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 03:04:06 crc kubenswrapper[4901]: I0309 03:04:06.903770 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.109364 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.113920 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.118633 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.186849 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-config-data\") pod \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.186939 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwjxk\" (UniqueName: \"kubernetes.io/projected/03fb66e1-3428-40d7-a0b4-b8a7938ff800-kube-api-access-cwjxk\") pod \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.187027 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-scripts\") pod \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.187073 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-config-data\") pod \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.187099 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-combined-ca-bundle\") pod \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.187148 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czk72\" 
(UniqueName: \"kubernetes.io/projected/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-kube-api-access-czk72\") pod \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\" (UID: \"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.187205 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-combined-ca-bundle\") pod \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.187283 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-scripts\") pod \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\" (UID: \"03fb66e1-3428-40d7-a0b4-b8a7938ff800\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.189284 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-64hwd"] Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.189550 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" podUID="2224855e-40d6-45cd-b001-18e3cc94610d" containerName="dnsmasq-dns" containerID="cri-o://2cabf147c324c9b0d8e1ce65fce8ffa92f5c6ad3220ab420158e1d3de32dea84" gracePeriod=10 Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.207520 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fb66e1-3428-40d7-a0b4-b8a7938ff800-kube-api-access-cwjxk" (OuterVolumeSpecName: "kube-api-access-cwjxk") pod "03fb66e1-3428-40d7-a0b4-b8a7938ff800" (UID: "03fb66e1-3428-40d7-a0b4-b8a7938ff800"). InnerVolumeSpecName "kube-api-access-cwjxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.207832 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-scripts" (OuterVolumeSpecName: "scripts") pod "03fb66e1-3428-40d7-a0b4-b8a7938ff800" (UID: "03fb66e1-3428-40d7-a0b4-b8a7938ff800"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.209444 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-scripts" (OuterVolumeSpecName: "scripts") pod "aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5" (UID: "aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.220866 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-kube-api-access-czk72" (OuterVolumeSpecName: "kube-api-access-czk72") pod "aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5" (UID: "aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5"). InnerVolumeSpecName "kube-api-access-czk72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.231527 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-config-data" (OuterVolumeSpecName: "config-data") pod "03fb66e1-3428-40d7-a0b4-b8a7938ff800" (UID: "03fb66e1-3428-40d7-a0b4-b8a7938ff800"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.232940 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-config-data" (OuterVolumeSpecName: "config-data") pod "aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5" (UID: "aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.256598 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03fb66e1-3428-40d7-a0b4-b8a7938ff800" (UID: "03fb66e1-3428-40d7-a0b4-b8a7938ff800"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.272423 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5" (UID: "aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.289826 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.289880 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.289893 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.289905 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czk72\" (UniqueName: \"kubernetes.io/projected/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5-kube-api-access-czk72\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.289914 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.289930 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.289938 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fb66e1-3428-40d7-a0b4-b8a7938ff800-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.289948 4901 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-cwjxk\" (UniqueName: \"kubernetes.io/projected/03fb66e1-3428-40d7-a0b4-b8a7938ff800-kube-api-access-cwjxk\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.609925 4901 generic.go:334] "Generic (PLEG): container finished" podID="2224855e-40d6-45cd-b001-18e3cc94610d" containerID="2cabf147c324c9b0d8e1ce65fce8ffa92f5c6ad3220ab420158e1d3de32dea84" exitCode=0 Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.610012 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" event={"ID":"2224855e-40d6-45cd-b001-18e3cc94610d","Type":"ContainerDied","Data":"2cabf147c324c9b0d8e1ce65fce8ffa92f5c6ad3220ab420158e1d3de32dea84"} Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.630624 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jv9hz" event={"ID":"03fb66e1-3428-40d7-a0b4-b8a7938ff800","Type":"ContainerDied","Data":"6daa5b108c01deec5d19f2ca4647b81cf00879c7dcb1fd8a91f3c95c06e3c69c"} Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.630661 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6daa5b108c01deec5d19f2ca4647b81cf00879c7dcb1fd8a91f3c95c06e3c69c" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.630728 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jv9hz" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.633245 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lb2xt" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.633290 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lb2xt" event={"ID":"aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5","Type":"ContainerDied","Data":"4a44912171118290a3155a3ed5211cf1797c82623d255dc66cc3f1bcb28ef1c6"} Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.633312 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a44912171118290a3155a3ed5211cf1797c82623d255dc66cc3f1bcb28ef1c6" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.702255 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.723072 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.758668 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 03:04:07 crc kubenswrapper[4901]: E0309 03:04:07.759585 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2224855e-40d6-45cd-b001-18e3cc94610d" containerName="init" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.759702 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2224855e-40d6-45cd-b001-18e3cc94610d" containerName="init" Mar 09 03:04:07 crc kubenswrapper[4901]: E0309 03:04:07.759784 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2224855e-40d6-45cd-b001-18e3cc94610d" containerName="dnsmasq-dns" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.759846 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2224855e-40d6-45cd-b001-18e3cc94610d" containerName="dnsmasq-dns" Mar 09 03:04:07 crc kubenswrapper[4901]: E0309 03:04:07.759910 4901 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="725ee9e1-68e0-4756-bfc7-e4d209aeeae8" containerName="oc" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.759966 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="725ee9e1-68e0-4756-bfc7-e4d209aeeae8" containerName="oc" Mar 09 03:04:07 crc kubenswrapper[4901]: E0309 03:04:07.760043 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5" containerName="nova-manage" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.760094 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5" containerName="nova-manage" Mar 09 03:04:07 crc kubenswrapper[4901]: E0309 03:04:07.760162 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fb66e1-3428-40d7-a0b4-b8a7938ff800" containerName="nova-cell1-conductor-db-sync" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.760219 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fb66e1-3428-40d7-a0b4-b8a7938ff800" containerName="nova-cell1-conductor-db-sync" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.760601 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2224855e-40d6-45cd-b001-18e3cc94610d" containerName="dnsmasq-dns" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.760691 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="725ee9e1-68e0-4756-bfc7-e4d209aeeae8" containerName="oc" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.760753 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="03fb66e1-3428-40d7-a0b4-b8a7938ff800" containerName="nova-cell1-conductor-db-sync" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.760840 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5" containerName="nova-manage" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.761669 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.764459 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.782203 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.851637 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-sb\") pod \"2224855e-40d6-45cd-b001-18e3cc94610d\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.852614 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-svc\") pod \"2224855e-40d6-45cd-b001-18e3cc94610d\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.852700 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-swift-storage-0\") pod \"2224855e-40d6-45cd-b001-18e3cc94610d\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.852877 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw2br\" (UniqueName: \"kubernetes.io/projected/2224855e-40d6-45cd-b001-18e3cc94610d-kube-api-access-hw2br\") pod \"2224855e-40d6-45cd-b001-18e3cc94610d\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.852964 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-nb\") pod \"2224855e-40d6-45cd-b001-18e3cc94610d\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.853080 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-config\") pod \"2224855e-40d6-45cd-b001-18e3cc94610d\" (UID: \"2224855e-40d6-45cd-b001-18e3cc94610d\") " Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.863552 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.864414 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" containerName="nova-api-log" containerID="cri-o://134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849" gracePeriod=30 Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.864517 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2224855e-40d6-45cd-b001-18e3cc94610d-kube-api-access-hw2br" (OuterVolumeSpecName: "kube-api-access-hw2br") pod "2224855e-40d6-45cd-b001-18e3cc94610d" (UID: "2224855e-40d6-45cd-b001-18e3cc94610d"). InnerVolumeSpecName "kube-api-access-hw2br". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.864567 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" containerName="nova-api-api" containerID="cri-o://2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971" gracePeriod=30 Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.877824 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.877857 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.910347 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.910566 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3b885da9-9534-4b38-af0f-6e88cd7c068b" containerName="nova-metadata-log" containerID="cri-o://69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84" gracePeriod=30 Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.911052 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3b885da9-9534-4b38-af0f-6e88cd7c068b" containerName="nova-metadata-metadata" containerID="cri-o://a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177" gracePeriod=30 Mar 09 03:04:07 crc 
kubenswrapper[4901]: I0309 03:04:07.923170 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.923524 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.952775 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2224855e-40d6-45cd-b001-18e3cc94610d" (UID: "2224855e-40d6-45cd-b001-18e3cc94610d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.952782 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-config" (OuterVolumeSpecName: "config") pod "2224855e-40d6-45cd-b001-18e3cc94610d" (UID: "2224855e-40d6-45cd-b001-18e3cc94610d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.955013 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.955158 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w69nf\" (UniqueName: \"kubernetes.io/projected/19b624e5-b3de-4724-b995-829d3fcd48ae-kube-api-access-w69nf\") pod \"nova-cell1-conductor-0\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.955403 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.955763 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.955782 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw2br\" (UniqueName: \"kubernetes.io/projected/2224855e-40d6-45cd-b001-18e3cc94610d-kube-api-access-hw2br\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.955792 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.974539 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2224855e-40d6-45cd-b001-18e3cc94610d" (UID: "2224855e-40d6-45cd-b001-18e3cc94610d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:04:07 crc kubenswrapper[4901]: I0309 03:04:07.977707 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2224855e-40d6-45cd-b001-18e3cc94610d" (UID: "2224855e-40d6-45cd-b001-18e3cc94610d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.002162 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2224855e-40d6-45cd-b001-18e3cc94610d" (UID: "2224855e-40d6-45cd-b001-18e3cc94610d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.057166 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.057235 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w69nf\" (UniqueName: \"kubernetes.io/projected/19b624e5-b3de-4724-b995-829d3fcd48ae-kube-api-access-w69nf\") pod \"nova-cell1-conductor-0\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.057273 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.057315 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.057325 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.057333 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2224855e-40d6-45cd-b001-18e3cc94610d-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.060299 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.060743 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.072329 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w69nf\" (UniqueName: \"kubernetes.io/projected/19b624e5-b3de-4724-b995-829d3fcd48ae-kube-api-access-w69nf\") pod \"nova-cell1-conductor-0\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.082123 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.251157 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.497017 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.597568 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.654484 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"19b624e5-b3de-4724-b995-829d3fcd48ae","Type":"ContainerStarted","Data":"e97742193c6f9c9ef8d7ea89c5569345682293bcd2199bd28944b14f3399b6a6"} Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.656404 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" event={"ID":"2224855e-40d6-45cd-b001-18e3cc94610d","Type":"ContainerDied","Data":"cbf0c3a00aaaa134deaab2830ae6a58dc74f3e70f0f557dab5d46534bdc85cb2"} Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.656413 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-765c5b6b49-64hwd" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.656455 4901 scope.go:117] "RemoveContainer" containerID="2cabf147c324c9b0d8e1ce65fce8ffa92f5c6ad3220ab420158e1d3de32dea84" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.663193 4901 generic.go:334] "Generic (PLEG): container finished" podID="3b885da9-9534-4b38-af0f-6e88cd7c068b" containerID="a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177" exitCode=0 Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.663251 4901 generic.go:334] "Generic (PLEG): container finished" podID="3b885da9-9534-4b38-af0f-6e88cd7c068b" containerID="69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84" exitCode=143 Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.663266 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.663319 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b885da9-9534-4b38-af0f-6e88cd7c068b","Type":"ContainerDied","Data":"a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177"} Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.663346 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b885da9-9534-4b38-af0f-6e88cd7c068b","Type":"ContainerDied","Data":"69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84"} Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.663356 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b885da9-9534-4b38-af0f-6e88cd7c068b","Type":"ContainerDied","Data":"58beab63d9f2c8e0c8541c0db19609f5e49be4c98b2cb81c5a934e63edd4e4b7"} Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.676574 4901 generic.go:334] "Generic (PLEG): container finished" podID="3c442232-3859-43a4-9f6a-bc330c647b14" containerID="134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849" exitCode=143 Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.677101 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c442232-3859-43a4-9f6a-bc330c647b14","Type":"ContainerDied","Data":"134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849"} Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.679622 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtsn4\" (UniqueName: \"kubernetes.io/projected/3b885da9-9534-4b38-af0f-6e88cd7c068b-kube-api-access-qtsn4\") pod \"3b885da9-9534-4b38-af0f-6e88cd7c068b\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.679773 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b885da9-9534-4b38-af0f-6e88cd7c068b-logs\") pod \"3b885da9-9534-4b38-af0f-6e88cd7c068b\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.679876 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-combined-ca-bundle\") pod \"3b885da9-9534-4b38-af0f-6e88cd7c068b\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.679916 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-config-data\") pod \"3b885da9-9534-4b38-af0f-6e88cd7c068b\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.679985 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-nova-metadata-tls-certs\") pod \"3b885da9-9534-4b38-af0f-6e88cd7c068b\" (UID: \"3b885da9-9534-4b38-af0f-6e88cd7c068b\") " Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.680287 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b885da9-9534-4b38-af0f-6e88cd7c068b-logs" (OuterVolumeSpecName: "logs") pod "3b885da9-9534-4b38-af0f-6e88cd7c068b" (UID: "3b885da9-9534-4b38-af0f-6e88cd7c068b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.680596 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b885da9-9534-4b38-af0f-6e88cd7c068b-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.684176 4901 scope.go:117] "RemoveContainer" containerID="51dd6706760fdfd6e90bc55edfe74e1ead6282dd2f77159077d8a1101b087c17" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.684328 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-64hwd"] Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.688335 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b885da9-9534-4b38-af0f-6e88cd7c068b-kube-api-access-qtsn4" (OuterVolumeSpecName: "kube-api-access-qtsn4") pod "3b885da9-9534-4b38-af0f-6e88cd7c068b" (UID: "3b885da9-9534-4b38-af0f-6e88cd7c068b"). InnerVolumeSpecName "kube-api-access-qtsn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.689197 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-765c5b6b49-64hwd"] Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.709050 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b885da9-9534-4b38-af0f-6e88cd7c068b" (UID: "3b885da9-9534-4b38-af0f-6e88cd7c068b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.712509 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-config-data" (OuterVolumeSpecName: "config-data") pod "3b885da9-9534-4b38-af0f-6e88cd7c068b" (UID: "3b885da9-9534-4b38-af0f-6e88cd7c068b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.713028 4901 scope.go:117] "RemoveContainer" containerID="a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.732127 4901 scope.go:117] "RemoveContainer" containerID="69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.756540 4901 scope.go:117] "RemoveContainer" containerID="a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177" Mar 09 03:04:08 crc kubenswrapper[4901]: E0309 03:04:08.757436 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177\": container with ID starting with a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177 not found: ID does not exist" containerID="a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.757485 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177"} err="failed to get container status \"a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177\": rpc error: code = NotFound desc = could not find container \"a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177\": container with ID starting with 
a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177 not found: ID does not exist" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.757512 4901 scope.go:117] "RemoveContainer" containerID="69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84" Mar 09 03:04:08 crc kubenswrapper[4901]: E0309 03:04:08.758299 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84\": container with ID starting with 69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84 not found: ID does not exist" containerID="69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.758372 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84"} err="failed to get container status \"69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84\": rpc error: code = NotFound desc = could not find container \"69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84\": container with ID starting with 69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84 not found: ID does not exist" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.758427 4901 scope.go:117] "RemoveContainer" containerID="a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.760728 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177"} err="failed to get container status \"a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177\": rpc error: code = NotFound desc = could not find container \"a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177\": container with ID 
starting with a8671cc2258e76bc71ba7f64fa80bf243d2e6e61388f89c3acd170d0767e7177 not found: ID does not exist" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.760936 4901 scope.go:117] "RemoveContainer" containerID="69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.761505 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3b885da9-9534-4b38-af0f-6e88cd7c068b" (UID: "3b885da9-9534-4b38-af0f-6e88cd7c068b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.767322 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84"} err="failed to get container status \"69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84\": rpc error: code = NotFound desc = could not find container \"69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84\": container with ID starting with 69e737b92715d93b1083f5edd16b85a658ab1e0f2c1839a2672f92f05e1c9a84 not found: ID does not exist" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.782799 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.782827 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.782837 4901 reconciler_common.go:293] "Volume detached for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b885da9-9534-4b38-af0f-6e88cd7c068b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:08 crc kubenswrapper[4901]: I0309 03:04:08.782861 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtsn4\" (UniqueName: \"kubernetes.io/projected/3b885da9-9534-4b38-af0f-6e88cd7c068b-kube-api-access-qtsn4\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.048887 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.082738 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.094999 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:09 crc kubenswrapper[4901]: E0309 03:04:09.095346 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b885da9-9534-4b38-af0f-6e88cd7c068b" containerName="nova-metadata-log" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.095360 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b885da9-9534-4b38-af0f-6e88cd7c068b" containerName="nova-metadata-log" Mar 09 03:04:09 crc kubenswrapper[4901]: E0309 03:04:09.095382 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b885da9-9534-4b38-af0f-6e88cd7c068b" containerName="nova-metadata-metadata" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.095388 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b885da9-9534-4b38-af0f-6e88cd7c068b" containerName="nova-metadata-metadata" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.095630 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b885da9-9534-4b38-af0f-6e88cd7c068b" containerName="nova-metadata-log" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.095654 4901 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3b885da9-9534-4b38-af0f-6e88cd7c068b" containerName="nova-metadata-metadata" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.097340 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.099845 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.100807 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.124312 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.291682 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4plp\" (UniqueName: \"kubernetes.io/projected/0c7fbb65-658d-416a-85da-243a966b9bc9-kube-api-access-f4plp\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.291767 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.291786 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-config-data\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc 
kubenswrapper[4901]: I0309 03:04:09.291882 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.292047 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c7fbb65-658d-416a-85da-243a966b9bc9-logs\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.393421 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.393477 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-config-data\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.393502 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.393535 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/0c7fbb65-658d-416a-85da-243a966b9bc9-logs\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.393645 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4plp\" (UniqueName: \"kubernetes.io/projected/0c7fbb65-658d-416a-85da-243a966b9bc9-kube-api-access-f4plp\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.394346 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c7fbb65-658d-416a-85da-243a966b9bc9-logs\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.398849 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.402926 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-config-data\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.403034 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc 
kubenswrapper[4901]: I0309 03:04:09.413141 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4plp\" (UniqueName: \"kubernetes.io/projected/0c7fbb65-658d-416a-85da-243a966b9bc9-kube-api-access-f4plp\") pod \"nova-metadata-0\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.424629 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.692971 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"19b624e5-b3de-4724-b995-829d3fcd48ae","Type":"ContainerStarted","Data":"efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591"} Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.693115 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.698954 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b7764b86-48f0-4058-a60d-9275796fe58c" containerName="nova-scheduler-scheduler" containerID="cri-o://5ce1250df11ac77db18e280bfbff43f51b13a9f3620235afe3114b645f1a1693" gracePeriod=30 Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.714016 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.713999047 podStartE2EDuration="2.713999047s" podCreationTimestamp="2026-03-09 03:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:09.708655472 +0000 UTC m=+1374.298319204" watchObservedRunningTime="2026-03-09 03:04:09.713999047 +0000 UTC m=+1374.303662779" Mar 09 03:04:09 crc kubenswrapper[4901]: I0309 03:04:09.916102 4901 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:10 crc kubenswrapper[4901]: I0309 03:04:10.140117 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2224855e-40d6-45cd-b001-18e3cc94610d" path="/var/lib/kubelet/pods/2224855e-40d6-45cd-b001-18e3cc94610d/volumes" Mar 09 03:04:10 crc kubenswrapper[4901]: I0309 03:04:10.142467 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b885da9-9534-4b38-af0f-6e88cd7c068b" path="/var/lib/kubelet/pods/3b885da9-9534-4b38-af0f-6e88cd7c068b/volumes" Mar 09 03:04:10 crc kubenswrapper[4901]: I0309 03:04:10.714998 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c7fbb65-658d-416a-85da-243a966b9bc9","Type":"ContainerStarted","Data":"84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346"} Mar 09 03:04:10 crc kubenswrapper[4901]: I0309 03:04:10.715261 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c7fbb65-658d-416a-85da-243a966b9bc9","Type":"ContainerStarted","Data":"dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625"} Mar 09 03:04:10 crc kubenswrapper[4901]: I0309 03:04:10.715277 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c7fbb65-658d-416a-85da-243a966b9bc9","Type":"ContainerStarted","Data":"0b697d69770acc967f3d5a01316656bf4883a8aa0eb64a1805dfde0d5bc66e6f"} Mar 09 03:04:10 crc kubenswrapper[4901]: I0309 03:04:10.742985 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.7429596 podStartE2EDuration="1.7429596s" podCreationTimestamp="2026-03-09 03:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:10.735723397 +0000 UTC m=+1375.325387139" watchObservedRunningTime="2026-03-09 
03:04:10.7429596 +0000 UTC m=+1375.332623362" Mar 09 03:04:11 crc kubenswrapper[4901]: E0309 03:04:11.810506 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ce1250df11ac77db18e280bfbff43f51b13a9f3620235afe3114b645f1a1693" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 03:04:11 crc kubenswrapper[4901]: E0309 03:04:11.813756 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ce1250df11ac77db18e280bfbff43f51b13a9f3620235afe3114b645f1a1693" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 03:04:11 crc kubenswrapper[4901]: E0309 03:04:11.815646 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ce1250df11ac77db18e280bfbff43f51b13a9f3620235afe3114b645f1a1693" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 03:04:11 crc kubenswrapper[4901]: E0309 03:04:11.815708 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b7764b86-48f0-4058-a60d-9275796fe58c" containerName="nova-scheduler-scheduler" Mar 09 03:04:12 crc kubenswrapper[4901]: I0309 03:04:12.740146 4901 generic.go:334] "Generic (PLEG): container finished" podID="b7764b86-48f0-4058-a60d-9275796fe58c" containerID="5ce1250df11ac77db18e280bfbff43f51b13a9f3620235afe3114b645f1a1693" exitCode=0 Mar 09 03:04:12 crc kubenswrapper[4901]: I0309 03:04:12.740276 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"b7764b86-48f0-4058-a60d-9275796fe58c","Type":"ContainerDied","Data":"5ce1250df11ac77db18e280bfbff43f51b13a9f3620235afe3114b645f1a1693"} Mar 09 03:04:12 crc kubenswrapper[4901]: I0309 03:04:12.853392 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:04:12 crc kubenswrapper[4901]: I0309 03:04:12.965766 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-config-data\") pod \"b7764b86-48f0-4058-a60d-9275796fe58c\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " Mar 09 03:04:12 crc kubenswrapper[4901]: I0309 03:04:12.965942 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jz7c\" (UniqueName: \"kubernetes.io/projected/b7764b86-48f0-4058-a60d-9275796fe58c-kube-api-access-6jz7c\") pod \"b7764b86-48f0-4058-a60d-9275796fe58c\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " Mar 09 03:04:12 crc kubenswrapper[4901]: I0309 03:04:12.965987 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-combined-ca-bundle\") pod \"b7764b86-48f0-4058-a60d-9275796fe58c\" (UID: \"b7764b86-48f0-4058-a60d-9275796fe58c\") " Mar 09 03:04:12 crc kubenswrapper[4901]: I0309 03:04:12.973426 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7764b86-48f0-4058-a60d-9275796fe58c-kube-api-access-6jz7c" (OuterVolumeSpecName: "kube-api-access-6jz7c") pod "b7764b86-48f0-4058-a60d-9275796fe58c" (UID: "b7764b86-48f0-4058-a60d-9275796fe58c"). InnerVolumeSpecName "kube-api-access-6jz7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:12 crc kubenswrapper[4901]: I0309 03:04:12.991908 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-config-data" (OuterVolumeSpecName: "config-data") pod "b7764b86-48f0-4058-a60d-9275796fe58c" (UID: "b7764b86-48f0-4058-a60d-9275796fe58c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:12 crc kubenswrapper[4901]: I0309 03:04:12.999733 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7764b86-48f0-4058-a60d-9275796fe58c" (UID: "b7764b86-48f0-4058-a60d-9275796fe58c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.068986 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jz7c\" (UniqueName: \"kubernetes.io/projected/b7764b86-48f0-4058-a60d-9275796fe58c-kube-api-access-6jz7c\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.069037 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.069055 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7764b86-48f0-4058-a60d-9275796fe58c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.128672 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.749538 4901 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.757488 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7764b86-48f0-4058-a60d-9275796fe58c","Type":"ContainerDied","Data":"88a9ee72d697005f633fa182795e673a83604d8f8f3ff10163899fc0259d33ad"} Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.757568 4901 scope.go:117] "RemoveContainer" containerID="5ce1250df11ac77db18e280bfbff43f51b13a9f3620235afe3114b645f1a1693" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.757798 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.760068 4901 generic.go:334] "Generic (PLEG): container finished" podID="3c442232-3859-43a4-9f6a-bc330c647b14" containerID="2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971" exitCode=0 Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.760107 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c442232-3859-43a4-9f6a-bc330c647b14","Type":"ContainerDied","Data":"2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971"} Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.760134 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c442232-3859-43a4-9f6a-bc330c647b14","Type":"ContainerDied","Data":"071d4808f8beab3ebc139bcb783c4e85a52870c24a0059bc855d8d99700b21bc"} Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.760190 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.789088 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-combined-ca-bundle\") pod \"3c442232-3859-43a4-9f6a-bc330c647b14\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.789288 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-config-data\") pod \"3c442232-3859-43a4-9f6a-bc330c647b14\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.794895 4901 scope.go:117] "RemoveContainer" containerID="2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.824039 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.837615 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.842734 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c442232-3859-43a4-9f6a-bc330c647b14" (UID: "3c442232-3859-43a4-9f6a-bc330c647b14"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.859102 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-config-data" (OuterVolumeSpecName: "config-data") pod "3c442232-3859-43a4-9f6a-bc330c647b14" (UID: "3c442232-3859-43a4-9f6a-bc330c647b14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.864656 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.865963 4901 scope.go:117] "RemoveContainer" containerID="134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849" Mar 09 03:04:13 crc kubenswrapper[4901]: E0309 03:04:13.867207 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" containerName="nova-api-log" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.867251 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" containerName="nova-api-log" Mar 09 03:04:13 crc kubenswrapper[4901]: E0309 03:04:13.867267 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" containerName="nova-api-api" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.867273 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" containerName="nova-api-api" Mar 09 03:04:13 crc kubenswrapper[4901]: E0309 03:04:13.867291 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7764b86-48f0-4058-a60d-9275796fe58c" containerName="nova-scheduler-scheduler" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.867298 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7764b86-48f0-4058-a60d-9275796fe58c" 
containerName="nova-scheduler-scheduler" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.867498 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" containerName="nova-api-log" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.867520 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7764b86-48f0-4058-a60d-9275796fe58c" containerName="nova-scheduler-scheduler" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.867532 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" containerName="nova-api-api" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.868142 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.870831 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.882150 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.890430 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96tc6\" (UniqueName: \"kubernetes.io/projected/3c442232-3859-43a4-9f6a-bc330c647b14-kube-api-access-96tc6\") pod \"3c442232-3859-43a4-9f6a-bc330c647b14\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.890482 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c442232-3859-43a4-9f6a-bc330c647b14-logs\") pod \"3c442232-3859-43a4-9f6a-bc330c647b14\" (UID: \"3c442232-3859-43a4-9f6a-bc330c647b14\") " Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.890714 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.890746 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n9jd\" (UniqueName: \"kubernetes.io/projected/8653c2ab-c097-435e-b694-5c894b6bdd11-kube-api-access-9n9jd\") pod \"nova-scheduler-0\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.890863 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-config-data\") pod \"nova-scheduler-0\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.890908 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.890918 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c442232-3859-43a4-9f6a-bc330c647b14-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.891126 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c442232-3859-43a4-9f6a-bc330c647b14-logs" (OuterVolumeSpecName: "logs") pod "3c442232-3859-43a4-9f6a-bc330c647b14" (UID: "3c442232-3859-43a4-9f6a-bc330c647b14"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.894448 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c442232-3859-43a4-9f6a-bc330c647b14-kube-api-access-96tc6" (OuterVolumeSpecName: "kube-api-access-96tc6") pod "3c442232-3859-43a4-9f6a-bc330c647b14" (UID: "3c442232-3859-43a4-9f6a-bc330c647b14"). InnerVolumeSpecName "kube-api-access-96tc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.928458 4901 scope.go:117] "RemoveContainer" containerID="2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971" Mar 09 03:04:13 crc kubenswrapper[4901]: E0309 03:04:13.929444 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971\": container with ID starting with 2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971 not found: ID does not exist" containerID="2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.929471 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971"} err="failed to get container status \"2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971\": rpc error: code = NotFound desc = could not find container \"2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971\": container with ID starting with 2262a0acce159c7c77430c43bd8af63d5d0bbe8965d780c78a1c50ba93771971 not found: ID does not exist" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.929490 4901 scope.go:117] "RemoveContainer" containerID="134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849" Mar 09 03:04:13 crc kubenswrapper[4901]: E0309 03:04:13.929810 
4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849\": container with ID starting with 134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849 not found: ID does not exist" containerID="134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.929830 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849"} err="failed to get container status \"134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849\": rpc error: code = NotFound desc = could not find container \"134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849\": container with ID starting with 134a1b6b3562289ab6b2e8c4a9b2c0888bd313dc89a6d399f2696607e0296849 not found: ID does not exist" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.939318 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.992714 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-config-data\") pod \"nova-scheduler-0\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.992774 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.992801 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9n9jd\" (UniqueName: \"kubernetes.io/projected/8653c2ab-c097-435e-b694-5c894b6bdd11-kube-api-access-9n9jd\") pod \"nova-scheduler-0\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.992959 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96tc6\" (UniqueName: \"kubernetes.io/projected/3c442232-3859-43a4-9f6a-bc330c647b14-kube-api-access-96tc6\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.992976 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c442232-3859-43a4-9f6a-bc330c647b14-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.996698 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-config-data\") pod \"nova-scheduler-0\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:13 crc kubenswrapper[4901]: I0309 03:04:13.996940 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.019473 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n9jd\" (UniqueName: \"kubernetes.io/projected/8653c2ab-c097-435e-b694-5c894b6bdd11-kube-api-access-9n9jd\") pod \"nova-scheduler-0\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.128779 4901 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b7764b86-48f0-4058-a60d-9275796fe58c" path="/var/lib/kubelet/pods/b7764b86-48f0-4058-a60d-9275796fe58c/volumes" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.129487 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.129735 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.133895 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.140858 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.144100 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.145004 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.196889 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgwm4\" (UniqueName: \"kubernetes.io/projected/258f621f-8909-4d36-8f2f-bdd166e47139-kube-api-access-qgwm4\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.196927 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/258f621f-8909-4d36-8f2f-bdd166e47139-logs\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.196951 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-config-data\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.196991 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.228735 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.298865 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgwm4\" (UniqueName: \"kubernetes.io/projected/258f621f-8909-4d36-8f2f-bdd166e47139-kube-api-access-qgwm4\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.298953 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/258f621f-8909-4d36-8f2f-bdd166e47139-logs\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.299018 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-config-data\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.299104 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.299943 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/258f621f-8909-4d36-8f2f-bdd166e47139-logs\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.308178 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-config-data\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.308865 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.326179 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgwm4\" (UniqueName: \"kubernetes.io/projected/258f621f-8909-4d36-8f2f-bdd166e47139-kube-api-access-qgwm4\") pod \"nova-api-0\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.425815 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.425966 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.462451 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:14 crc kubenswrapper[4901]: W0309 03:04:14.710312 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8653c2ab_c097_435e_b694_5c894b6bdd11.slice/crio-43c1362482fc71ee4beee7cac5ce56c56f84ae35e91d7b3cef3569a3d2bcb581 WatchSource:0}: Error finding container 43c1362482fc71ee4beee7cac5ce56c56f84ae35e91d7b3cef3569a3d2bcb581: Status 404 returned error can't find the container with id 43c1362482fc71ee4beee7cac5ce56c56f84ae35e91d7b3cef3569a3d2bcb581 Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.711466 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.773162 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8653c2ab-c097-435e-b694-5c894b6bdd11","Type":"ContainerStarted","Data":"43c1362482fc71ee4beee7cac5ce56c56f84ae35e91d7b3cef3569a3d2bcb581"} Mar 09 03:04:14 crc kubenswrapper[4901]: W0309 03:04:14.915721 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod258f621f_8909_4d36_8f2f_bdd166e47139.slice/crio-e476aa66077d85f60a3bdd6349d821c47394cb422ca4e82a6e76537fa21c7c73 WatchSource:0}: Error finding container e476aa66077d85f60a3bdd6349d821c47394cb422ca4e82a6e76537fa21c7c73: Status 404 returned error can't find the container with id e476aa66077d85f60a3bdd6349d821c47394cb422ca4e82a6e76537fa21c7c73 Mar 09 03:04:14 crc kubenswrapper[4901]: I0309 03:04:14.918652 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:15 crc kubenswrapper[4901]: I0309 03:04:15.788522 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"258f621f-8909-4d36-8f2f-bdd166e47139","Type":"ContainerStarted","Data":"e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9"} Mar 09 03:04:15 crc kubenswrapper[4901]: I0309 03:04:15.788829 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"258f621f-8909-4d36-8f2f-bdd166e47139","Type":"ContainerStarted","Data":"8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f"} Mar 09 03:04:15 crc kubenswrapper[4901]: I0309 03:04:15.788844 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"258f621f-8909-4d36-8f2f-bdd166e47139","Type":"ContainerStarted","Data":"e476aa66077d85f60a3bdd6349d821c47394cb422ca4e82a6e76537fa21c7c73"} Mar 09 03:04:15 crc kubenswrapper[4901]: I0309 03:04:15.790455 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8653c2ab-c097-435e-b694-5c894b6bdd11","Type":"ContainerStarted","Data":"5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602"} Mar 09 03:04:15 crc kubenswrapper[4901]: I0309 03:04:15.836893 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8368646379999998 podStartE2EDuration="2.836864638s" podCreationTimestamp="2026-03-09 03:04:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:15.835020922 +0000 UTC m=+1380.424684694" watchObservedRunningTime="2026-03-09 03:04:15.836864638 +0000 UTC m=+1380.426528410" Mar 09 03:04:15 crc kubenswrapper[4901]: I0309 03:04:15.851386 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.851362354 podStartE2EDuration="1.851362354s" podCreationTimestamp="2026-03-09 03:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-09 03:04:15.817688505 +0000 UTC m=+1380.407352247" watchObservedRunningTime="2026-03-09 03:04:15.851362354 +0000 UTC m=+1380.441026096" Mar 09 03:04:16 crc kubenswrapper[4901]: I0309 03:04:16.125208 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c442232-3859-43a4-9f6a-bc330c647b14" path="/var/lib/kubelet/pods/3c442232-3859-43a4-9f6a-bc330c647b14/volumes" Mar 09 03:04:17 crc kubenswrapper[4901]: I0309 03:04:17.661180 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:04:17 crc kubenswrapper[4901]: I0309 03:04:17.661704 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="993fdfea-9981-48c0-9b5b-c78eab5106a0" containerName="kube-state-metrics" containerID="cri-o://bbda781da1c908b27a4862b6c710a520d1d856aa2df5f0b8d9a4ae8fa51858c2" gracePeriod=30 Mar 09 03:04:17 crc kubenswrapper[4901]: I0309 03:04:17.861338 4901 generic.go:334] "Generic (PLEG): container finished" podID="993fdfea-9981-48c0-9b5b-c78eab5106a0" containerID="bbda781da1c908b27a4862b6c710a520d1d856aa2df5f0b8d9a4ae8fa51858c2" exitCode=2 Mar 09 03:04:17 crc kubenswrapper[4901]: I0309 03:04:17.861385 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"993fdfea-9981-48c0-9b5b-c78eab5106a0","Type":"ContainerDied","Data":"bbda781da1c908b27a4862b6c710a520d1d856aa2df5f0b8d9a4ae8fa51858c2"} Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.205325 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.375003 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhzl8\" (UniqueName: \"kubernetes.io/projected/993fdfea-9981-48c0-9b5b-c78eab5106a0-kube-api-access-rhzl8\") pod \"993fdfea-9981-48c0-9b5b-c78eab5106a0\" (UID: \"993fdfea-9981-48c0-9b5b-c78eab5106a0\") " Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.386936 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/993fdfea-9981-48c0-9b5b-c78eab5106a0-kube-api-access-rhzl8" (OuterVolumeSpecName: "kube-api-access-rhzl8") pod "993fdfea-9981-48c0-9b5b-c78eab5106a0" (UID: "993fdfea-9981-48c0-9b5b-c78eab5106a0"). InnerVolumeSpecName "kube-api-access-rhzl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.477537 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhzl8\" (UniqueName: \"kubernetes.io/projected/993fdfea-9981-48c0-9b5b-c78eab5106a0-kube-api-access-rhzl8\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.876541 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"993fdfea-9981-48c0-9b5b-c78eab5106a0","Type":"ContainerDied","Data":"8cdb357b3a0e049df907d97e8d5470b2840561b835a914a24933289af5f80c99"} Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.876608 4901 scope.go:117] "RemoveContainer" containerID="bbda781da1c908b27a4862b6c710a520d1d856aa2df5f0b8d9a4ae8fa51858c2" Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.876614 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.932467 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.951805 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.987474 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:04:18 crc kubenswrapper[4901]: E0309 03:04:18.988305 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993fdfea-9981-48c0-9b5b-c78eab5106a0" containerName="kube-state-metrics" Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.988337 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="993fdfea-9981-48c0-9b5b-c78eab5106a0" containerName="kube-state-metrics" Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.988703 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="993fdfea-9981-48c0-9b5b-c78eab5106a0" containerName="kube-state-metrics" Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.989732 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.991696 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 09 03:04:18 crc kubenswrapper[4901]: I0309 03:04:18.991889 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.010513 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.188424 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.188468 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.188557 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.188603 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5jhd\" (UniqueName: 
\"kubernetes.io/projected/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-api-access-z5jhd\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.229453 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.289948 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.290073 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.290145 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5jhd\" (UniqueName: \"kubernetes.io/projected/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-api-access-z5jhd\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.290201 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.296179 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.308354 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.308461 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.312175 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5jhd\" (UniqueName: \"kubernetes.io/projected/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-api-access-z5jhd\") pod \"kube-state-metrics-0\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") " pod="openstack/kube-state-metrics-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.424955 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.425369 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.449019 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 
03:04:19.449358 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="ceilometer-central-agent" containerID="cri-o://be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707" gracePeriod=30 Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.449420 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="proxy-httpd" containerID="cri-o://490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783" gracePeriod=30 Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.449468 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="ceilometer-notification-agent" containerID="cri-o://194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f" gracePeriod=30 Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.449418 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="sg-core" containerID="cri-o://327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213" gracePeriod=30 Mar 09 03:04:19 crc kubenswrapper[4901]: I0309 03:04:19.611754 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 03:04:20 crc kubenswrapper[4901]: I0309 03:04:19.892934 4901 generic.go:334] "Generic (PLEG): container finished" podID="82b374be-f2cd-4656-87de-434995c335b8" containerID="490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783" exitCode=0 Mar 09 03:04:20 crc kubenswrapper[4901]: I0309 03:04:19.893204 4901 generic.go:334] "Generic (PLEG): container finished" podID="82b374be-f2cd-4656-87de-434995c335b8" containerID="327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213" exitCode=2 Mar 09 03:04:20 crc kubenswrapper[4901]: I0309 03:04:19.893213 4901 generic.go:334] "Generic (PLEG): container finished" podID="82b374be-f2cd-4656-87de-434995c335b8" containerID="be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707" exitCode=0 Mar 09 03:04:20 crc kubenswrapper[4901]: I0309 03:04:19.893088 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b374be-f2cd-4656-87de-434995c335b8","Type":"ContainerDied","Data":"490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783"} Mar 09 03:04:20 crc kubenswrapper[4901]: I0309 03:04:19.893283 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b374be-f2cd-4656-87de-434995c335b8","Type":"ContainerDied","Data":"327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213"} Mar 09 03:04:20 crc kubenswrapper[4901]: I0309 03:04:19.893296 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b374be-f2cd-4656-87de-434995c335b8","Type":"ContainerDied","Data":"be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707"} Mar 09 03:04:20 crc kubenswrapper[4901]: I0309 03:04:20.118406 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="993fdfea-9981-48c0-9b5b-c78eab5106a0" path="/var/lib/kubelet/pods/993fdfea-9981-48c0-9b5b-c78eab5106a0/volumes" Mar 09 03:04:20 crc 
kubenswrapper[4901]: I0309 03:04:20.439472 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 03:04:20 crc kubenswrapper[4901]: I0309 03:04:20.439498 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 03:04:20 crc kubenswrapper[4901]: I0309 03:04:20.793745 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:04:20 crc kubenswrapper[4901]: I0309 03:04:20.908697 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7050aa4c-725b-482a-8b90-f1374b3a4a42","Type":"ContainerStarted","Data":"1fea3c826a151a41c7248570920e76e67f4663d409b5d8418412d59bbc46f4e5"} Mar 09 03:04:21 crc kubenswrapper[4901]: I0309 03:04:21.920091 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7050aa4c-725b-482a-8b90-f1374b3a4a42","Type":"ContainerStarted","Data":"149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6"} Mar 09 03:04:21 crc kubenswrapper[4901]: I0309 03:04:21.920495 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 03:04:21 crc kubenswrapper[4901]: I0309 03:04:21.938645 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.490812473 podStartE2EDuration="3.938617238s" podCreationTimestamp="2026-03-09 03:04:18 +0000 UTC" firstStartedPulling="2026-03-09 
03:04:20.793005933 +0000 UTC m=+1385.382669665" lastFinishedPulling="2026-03-09 03:04:21.240810658 +0000 UTC m=+1385.830474430" observedRunningTime="2026-03-09 03:04:21.936819543 +0000 UTC m=+1386.526483345" watchObservedRunningTime="2026-03-09 03:04:21.938617238 +0000 UTC m=+1386.528281010" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.564483 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.661163 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-log-httpd\") pod \"82b374be-f2cd-4656-87de-434995c335b8\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.661324 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-sg-core-conf-yaml\") pod \"82b374be-f2cd-4656-87de-434995c335b8\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.661407 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-run-httpd\") pod \"82b374be-f2cd-4656-87de-434995c335b8\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.661462 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-scripts\") pod \"82b374be-f2cd-4656-87de-434995c335b8\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.661531 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-combined-ca-bundle\") pod \"82b374be-f2cd-4656-87de-434995c335b8\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.661675 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "82b374be-f2cd-4656-87de-434995c335b8" (UID: "82b374be-f2cd-4656-87de-434995c335b8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.661804 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "82b374be-f2cd-4656-87de-434995c335b8" (UID: "82b374be-f2cd-4656-87de-434995c335b8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.662951 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2rhj\" (UniqueName: \"kubernetes.io/projected/82b374be-f2cd-4656-87de-434995c335b8-kube-api-access-n2rhj\") pod \"82b374be-f2cd-4656-87de-434995c335b8\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.663488 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-config-data\") pod \"82b374be-f2cd-4656-87de-434995c335b8\" (UID: \"82b374be-f2cd-4656-87de-434995c335b8\") " Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.664683 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.664709 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b374be-f2cd-4656-87de-434995c335b8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.669423 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-scripts" (OuterVolumeSpecName: "scripts") pod "82b374be-f2cd-4656-87de-434995c335b8" (UID: "82b374be-f2cd-4656-87de-434995c335b8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.669583 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b374be-f2cd-4656-87de-434995c335b8-kube-api-access-n2rhj" (OuterVolumeSpecName: "kube-api-access-n2rhj") pod "82b374be-f2cd-4656-87de-434995c335b8" (UID: "82b374be-f2cd-4656-87de-434995c335b8"). InnerVolumeSpecName "kube-api-access-n2rhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.691877 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "82b374be-f2cd-4656-87de-434995c335b8" (UID: "82b374be-f2cd-4656-87de-434995c335b8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.766046 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.766083 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.766101 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2rhj\" (UniqueName: \"kubernetes.io/projected/82b374be-f2cd-4656-87de-434995c335b8-kube-api-access-n2rhj\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.766695 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "82b374be-f2cd-4656-87de-434995c335b8" (UID: "82b374be-f2cd-4656-87de-434995c335b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.793877 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-config-data" (OuterVolumeSpecName: "config-data") pod "82b374be-f2cd-4656-87de-434995c335b8" (UID: "82b374be-f2cd-4656-87de-434995c335b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.868329 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.868368 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b374be-f2cd-4656-87de-434995c335b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.933451 4901 generic.go:334] "Generic (PLEG): container finished" podID="82b374be-f2cd-4656-87de-434995c335b8" containerID="194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f" exitCode=0 Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.933515 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.933560 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b374be-f2cd-4656-87de-434995c335b8","Type":"ContainerDied","Data":"194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f"} Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.933614 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b374be-f2cd-4656-87de-434995c335b8","Type":"ContainerDied","Data":"d52cb09ac30319a2c4deb33c5e55021fc2ff481395bdb4392a711410f679a1a7"} Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.933644 4901 scope.go:117] "RemoveContainer" containerID="490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.957665 4901 scope.go:117] "RemoveContainer" containerID="327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213" Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.974253 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:22 crc kubenswrapper[4901]: I0309 03:04:22.994949 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.011099 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:23 crc kubenswrapper[4901]: E0309 03:04:23.011628 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="sg-core" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.011643 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="sg-core" Mar 09 03:04:23 crc kubenswrapper[4901]: E0309 03:04:23.011659 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b374be-f2cd-4656-87de-434995c335b8" 
containerName="ceilometer-notification-agent" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.011667 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="ceilometer-notification-agent" Mar 09 03:04:23 crc kubenswrapper[4901]: E0309 03:04:23.011683 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="ceilometer-central-agent" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.011691 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="ceilometer-central-agent" Mar 09 03:04:23 crc kubenswrapper[4901]: E0309 03:04:23.011716 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="proxy-httpd" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.011724 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="proxy-httpd" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.011939 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="sg-core" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.011976 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="ceilometer-notification-agent" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.011990 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="ceilometer-central-agent" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.012005 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b374be-f2cd-4656-87de-434995c335b8" containerName="proxy-httpd" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.014359 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.017332 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.017536 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.021478 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.027047 4901 scope.go:117] "RemoveContainer" containerID="194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.031586 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.068611 4901 scope.go:117] "RemoveContainer" containerID="be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.088955 4901 scope.go:117] "RemoveContainer" containerID="490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783" Mar 09 03:04:23 crc kubenswrapper[4901]: E0309 03:04:23.089533 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783\": container with ID starting with 490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783 not found: ID does not exist" containerID="490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.089589 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783"} err="failed to get container status 
\"490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783\": rpc error: code = NotFound desc = could not find container \"490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783\": container with ID starting with 490fc515cecfddb8c4df164cdf67c0f9aa81056209eb1f052da2c95b40961783 not found: ID does not exist" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.089628 4901 scope.go:117] "RemoveContainer" containerID="327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213" Mar 09 03:04:23 crc kubenswrapper[4901]: E0309 03:04:23.089960 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213\": container with ID starting with 327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213 not found: ID does not exist" containerID="327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.089999 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213"} err="failed to get container status \"327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213\": rpc error: code = NotFound desc = could not find container \"327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213\": container with ID starting with 327e1760b929a425e6deaf746b7c294662305acfc7f358f0737461a3d46dc213 not found: ID does not exist" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.090022 4901 scope.go:117] "RemoveContainer" containerID="194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f" Mar 09 03:04:23 crc kubenswrapper[4901]: E0309 03:04:23.090329 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f\": container with ID starting with 194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f not found: ID does not exist" containerID="194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.090369 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f"} err="failed to get container status \"194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f\": rpc error: code = NotFound desc = could not find container \"194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f\": container with ID starting with 194f85138ab8744ab8a124c0966ef227053b765ea797045a5a24001b90c9697f not found: ID does not exist" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.090393 4901 scope.go:117] "RemoveContainer" containerID="be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707" Mar 09 03:04:23 crc kubenswrapper[4901]: E0309 03:04:23.090757 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707\": container with ID starting with be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707 not found: ID does not exist" containerID="be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.090790 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707"} err="failed to get container status \"be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707\": rpc error: code = NotFound desc = could not find container \"be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707\": container with ID 
starting with be64a084be3568bf9e8eacb212506ae565f9cb7c53a2ab9b4191564a854b1707 not found: ID does not exist" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.176361 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-scripts\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.176471 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-log-httpd\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.176557 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.176762 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm5xj\" (UniqueName: \"kubernetes.io/projected/af8a02ba-7609-485f-86c7-09a4d9ba2e59-kube-api-access-xm5xj\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.176937 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-run-httpd\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc 
kubenswrapper[4901]: I0309 03:04:23.177150 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-config-data\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.177310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.177354 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.278908 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-config-data\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.278993 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.279016 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.279053 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-scripts\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.279077 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-log-httpd\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.279112 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.279203 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm5xj\" (UniqueName: \"kubernetes.io/projected/af8a02ba-7609-485f-86c7-09a4d9ba2e59-kube-api-access-xm5xj\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.279266 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-run-httpd\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc 
kubenswrapper[4901]: I0309 03:04:23.280419 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-log-httpd\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.281268 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-run-httpd\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.284008 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.285120 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.286029 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-config-data\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.295149 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-scripts\") pod \"ceilometer-0\" (UID: 
\"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.295728 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.301016 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm5xj\" (UniqueName: \"kubernetes.io/projected/af8a02ba-7609-485f-86c7-09a4d9ba2e59-kube-api-access-xm5xj\") pod \"ceilometer-0\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") " pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.353493 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.869283 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:23 crc kubenswrapper[4901]: W0309 03:04:23.878481 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf8a02ba_7609_485f_86c7_09a4d9ba2e59.slice/crio-6c5334d2f7cd065a52ffe5ff3d49e355f6911d174012743979061301d813441b WatchSource:0}: Error finding container 6c5334d2f7cd065a52ffe5ff3d49e355f6911d174012743979061301d813441b: Status 404 returned error can't find the container with id 6c5334d2f7cd065a52ffe5ff3d49e355f6911d174012743979061301d813441b Mar 09 03:04:23 crc kubenswrapper[4901]: I0309 03:04:23.951266 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8a02ba-7609-485f-86c7-09a4d9ba2e59","Type":"ContainerStarted","Data":"6c5334d2f7cd065a52ffe5ff3d49e355f6911d174012743979061301d813441b"} Mar 09 03:04:24 crc kubenswrapper[4901]: I0309 
03:04:24.116569 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b374be-f2cd-4656-87de-434995c335b8" path="/var/lib/kubelet/pods/82b374be-f2cd-4656-87de-434995c335b8/volumes" Mar 09 03:04:24 crc kubenswrapper[4901]: I0309 03:04:24.228867 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 03:04:24 crc kubenswrapper[4901]: I0309 03:04:24.265793 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 03:04:24 crc kubenswrapper[4901]: I0309 03:04:24.463132 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 03:04:24 crc kubenswrapper[4901]: I0309 03:04:24.463200 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 03:04:24 crc kubenswrapper[4901]: I0309 03:04:24.964826 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8a02ba-7609-485f-86c7-09a4d9ba2e59","Type":"ContainerStarted","Data":"15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db"} Mar 09 03:04:25 crc kubenswrapper[4901]: I0309 03:04:25.006434 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 03:04:25 crc kubenswrapper[4901]: I0309 03:04:25.545410 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 03:04:25 crc kubenswrapper[4901]: I0309 03:04:25.545409 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 03:04:25 crc kubenswrapper[4901]: I0309 03:04:25.977580 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8a02ba-7609-485f-86c7-09a4d9ba2e59","Type":"ContainerStarted","Data":"3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1"} Mar 09 03:04:25 crc kubenswrapper[4901]: I0309 03:04:25.977912 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8a02ba-7609-485f-86c7-09a4d9ba2e59","Type":"ContainerStarted","Data":"9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4"} Mar 09 03:04:27 crc kubenswrapper[4901]: I0309 03:04:27.998572 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8a02ba-7609-485f-86c7-09a4d9ba2e59","Type":"ContainerStarted","Data":"0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221"} Mar 09 03:04:27 crc kubenswrapper[4901]: I0309 03:04:27.999360 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 03:04:29 crc kubenswrapper[4901]: I0309 03:04:29.432557 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 03:04:29 crc kubenswrapper[4901]: I0309 03:04:29.434485 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 03:04:29 crc kubenswrapper[4901]: I0309 03:04:29.440127 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 03:04:29 crc kubenswrapper[4901]: I0309 03:04:29.461832 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.75677703 podStartE2EDuration="7.461817879s" podCreationTimestamp="2026-03-09 03:04:22 +0000 UTC" firstStartedPulling="2026-03-09 03:04:23.880414814 +0000 UTC m=+1388.470078546" 
lastFinishedPulling="2026-03-09 03:04:27.585455633 +0000 UTC m=+1392.175119395" observedRunningTime="2026-03-09 03:04:28.027584774 +0000 UTC m=+1392.617248506" watchObservedRunningTime="2026-03-09 03:04:29.461817879 +0000 UTC m=+1394.051481611" Mar 09 03:04:29 crc kubenswrapper[4901]: I0309 03:04:29.631776 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 03:04:30 crc kubenswrapper[4901]: I0309 03:04:30.024016 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 03:04:31 crc kubenswrapper[4901]: I0309 03:04:31.935589 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.042050 4901 generic.go:334] "Generic (PLEG): container finished" podID="8155ec57-49a6-4d30-930e-9f10e3f28d17" containerID="db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c" exitCode=137 Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.042121 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8155ec57-49a6-4d30-930e-9f10e3f28d17","Type":"ContainerDied","Data":"db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c"} Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.042158 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8155ec57-49a6-4d30-930e-9f10e3f28d17","Type":"ContainerDied","Data":"6353c34d78ff3563cd0844b477ac08d81a1bc39a1bf3831ad829e47004f81e4c"} Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.042176 4901 scope.go:117] "RemoveContainer" containerID="db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.042190 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.061817 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-combined-ca-bundle\") pod \"8155ec57-49a6-4d30-930e-9f10e3f28d17\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.061918 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnpwr\" (UniqueName: \"kubernetes.io/projected/8155ec57-49a6-4d30-930e-9f10e3f28d17-kube-api-access-nnpwr\") pod \"8155ec57-49a6-4d30-930e-9f10e3f28d17\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.061962 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-config-data\") pod \"8155ec57-49a6-4d30-930e-9f10e3f28d17\" (UID: \"8155ec57-49a6-4d30-930e-9f10e3f28d17\") " Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.069576 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8155ec57-49a6-4d30-930e-9f10e3f28d17-kube-api-access-nnpwr" (OuterVolumeSpecName: "kube-api-access-nnpwr") pod "8155ec57-49a6-4d30-930e-9f10e3f28d17" (UID: "8155ec57-49a6-4d30-930e-9f10e3f28d17"). InnerVolumeSpecName "kube-api-access-nnpwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.069696 4901 scope.go:117] "RemoveContainer" containerID="db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c" Mar 09 03:04:32 crc kubenswrapper[4901]: E0309 03:04:32.070087 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c\": container with ID starting with db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c not found: ID does not exist" containerID="db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.070124 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c"} err="failed to get container status \"db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c\": rpc error: code = NotFound desc = could not find container \"db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c\": container with ID starting with db19a4f3b32cda9cbdf9d87cfcc422227be8d976ac4f57d9207a8d89e12f330c not found: ID does not exist" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.090548 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8155ec57-49a6-4d30-930e-9f10e3f28d17" (UID: "8155ec57-49a6-4d30-930e-9f10e3f28d17"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.103565 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-config-data" (OuterVolumeSpecName: "config-data") pod "8155ec57-49a6-4d30-930e-9f10e3f28d17" (UID: "8155ec57-49a6-4d30-930e-9f10e3f28d17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.165014 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnpwr\" (UniqueName: \"kubernetes.io/projected/8155ec57-49a6-4d30-930e-9f10e3f28d17-kube-api-access-nnpwr\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.165064 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.165079 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8155ec57-49a6-4d30-930e-9f10e3f28d17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.370944 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.387173 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.406281 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 03:04:32 crc kubenswrapper[4901]: E0309 03:04:32.406883 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8155ec57-49a6-4d30-930e-9f10e3f28d17" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 
03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.406920 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8155ec57-49a6-4d30-930e-9f10e3f28d17" containerName="nova-cell1-novncproxy-novncproxy"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.407504 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8155ec57-49a6-4d30-930e-9f10e3f28d17" containerName="nova-cell1-novncproxy-novncproxy"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.408799 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.411537 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.412522 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.412955 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.415786 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.573996 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.574112 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.574511 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb5x2\" (UniqueName: \"kubernetes.io/projected/52461a44-ded9-4025-b0f1-85c22462a04f-kube-api-access-gb5x2\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.574693 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.574763 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.676617 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.676736 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb5x2\" (UniqueName: \"kubernetes.io/projected/52461a44-ded9-4025-b0f1-85c22462a04f-kube-api-access-gb5x2\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.676797 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.676831 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.676920 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.682586 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.683185 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.684373 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.696429 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.697099 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb5x2\" (UniqueName: \"kubernetes.io/projected/52461a44-ded9-4025-b0f1-85c22462a04f-kube-api-access-gb5x2\") pod \"nova-cell1-novncproxy-0\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:32 crc kubenswrapper[4901]: I0309 03:04:32.728871 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:33 crc kubenswrapper[4901]: I0309 03:04:33.312788 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 03:04:33 crc kubenswrapper[4901]: I0309 03:04:33.863975 4901 scope.go:117] "RemoveContainer" containerID="6e34be1e8a47029c733d66e7e516cb6e368f429de6d79071ffc355445d8f8446"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.066152 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52461a44-ded9-4025-b0f1-85c22462a04f","Type":"ContainerStarted","Data":"4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa"}
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.066206 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52461a44-ded9-4025-b0f1-85c22462a04f","Type":"ContainerStarted","Data":"22706626cd82769b885a529ea386be9f38a0164eb502bacdbbb1be7b00386f81"}
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.095736 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.095717635 podStartE2EDuration="2.095717635s" podCreationTimestamp="2026-03-09 03:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:34.081218519 +0000 UTC m=+1398.670882281" watchObservedRunningTime="2026-03-09 03:04:34.095717635 +0000 UTC m=+1398.685381367"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.129170 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8155ec57-49a6-4d30-930e-9f10e3f28d17" path="/var/lib/kubelet/pods/8155ec57-49a6-4d30-930e-9f10e3f28d17/volumes"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.467368 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.468694 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.469179 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.469227 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.473722 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.474237 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.707503 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-vw8lc"]
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.710989 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.722472 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-vw8lc"]
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.841108 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcw8z\" (UniqueName: \"kubernetes.io/projected/4ba69329-2c9f-4938-89b0-d1fa314d5a30-kube-api-access-vcw8z\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.841152 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-sb\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.841173 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-config\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.841186 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-nb\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.841246 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-swift-storage-0\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.841705 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-svc\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.943919 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-svc\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.944031 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcw8z\" (UniqueName: \"kubernetes.io/projected/4ba69329-2c9f-4938-89b0-d1fa314d5a30-kube-api-access-vcw8z\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.944054 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-sb\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.944075 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-config\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.944090 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-nb\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.944123 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-swift-storage-0\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.944944 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-svc\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.945041 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-config\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.945232 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-nb\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.945309 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-sb\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.945338 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-swift-storage-0\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:34 crc kubenswrapper[4901]: I0309 03:04:34.969046 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcw8z\" (UniqueName: \"kubernetes.io/projected/4ba69329-2c9f-4938-89b0-d1fa314d5a30-kube-api-access-vcw8z\") pod \"dnsmasq-dns-c8964d89c-vw8lc\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:35 crc kubenswrapper[4901]: I0309 03:04:35.050481 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:35 crc kubenswrapper[4901]: W0309 03:04:35.587535 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ba69329_2c9f_4938_89b0_d1fa314d5a30.slice/crio-f2071503a136379a8a6d0a61f956bf12bd79625d11cb0b7a8f2eb58b90a4db68 WatchSource:0}: Error finding container f2071503a136379a8a6d0a61f956bf12bd79625d11cb0b7a8f2eb58b90a4db68: Status 404 returned error can't find the container with id f2071503a136379a8a6d0a61f956bf12bd79625d11cb0b7a8f2eb58b90a4db68
Mar 09 03:04:35 crc kubenswrapper[4901]: I0309 03:04:35.593894 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-vw8lc"]
Mar 09 03:04:36 crc kubenswrapper[4901]: I0309 03:04:36.098113 4901 generic.go:334] "Generic (PLEG): container finished" podID="4ba69329-2c9f-4938-89b0-d1fa314d5a30" containerID="ff692325b2a193aa8217a1042192281dc1b97f1fb55180f2002c7c09b3a9f731" exitCode=0
Mar 09 03:04:36 crc kubenswrapper[4901]: I0309 03:04:36.099078 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc" event={"ID":"4ba69329-2c9f-4938-89b0-d1fa314d5a30","Type":"ContainerDied","Data":"ff692325b2a193aa8217a1042192281dc1b97f1fb55180f2002c7c09b3a9f731"}
Mar 09 03:04:36 crc kubenswrapper[4901]: I0309 03:04:36.099116 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc" event={"ID":"4ba69329-2c9f-4938-89b0-d1fa314d5a30","Type":"ContainerStarted","Data":"f2071503a136379a8a6d0a61f956bf12bd79625d11cb0b7a8f2eb58b90a4db68"}
Mar 09 03:04:36 crc kubenswrapper[4901]: I0309 03:04:36.588095 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 03:04:36 crc kubenswrapper[4901]: I0309 03:04:36.588585 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="ceilometer-central-agent" containerID="cri-o://15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db" gracePeriod=30
Mar 09 03:04:36 crc kubenswrapper[4901]: I0309 03:04:36.588690 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="ceilometer-notification-agent" containerID="cri-o://9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4" gracePeriod=30
Mar 09 03:04:36 crc kubenswrapper[4901]: I0309 03:04:36.588717 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="sg-core" containerID="cri-o://3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1" gracePeriod=30
Mar 09 03:04:36 crc kubenswrapper[4901]: I0309 03:04:36.588701 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="proxy-httpd" containerID="cri-o://0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221" gracePeriod=30
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.122489 4901 generic.go:334] "Generic (PLEG): container finished" podID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerID="0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221" exitCode=0
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.122789 4901 generic.go:334] "Generic (PLEG): container finished" podID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerID="3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1" exitCode=2
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.122800 4901 generic.go:334] "Generic (PLEG): container finished" podID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerID="15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db" exitCode=0
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.123008 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8a02ba-7609-485f-86c7-09a4d9ba2e59","Type":"ContainerDied","Data":"0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221"}
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.123037 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8a02ba-7609-485f-86c7-09a4d9ba2e59","Type":"ContainerDied","Data":"3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1"}
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.123047 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8a02ba-7609-485f-86c7-09a4d9ba2e59","Type":"ContainerDied","Data":"15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db"}
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.135129 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc" event={"ID":"4ba69329-2c9f-4938-89b0-d1fa314d5a30","Type":"ContainerStarted","Data":"2023e3fb3e8f7021afadc49905f2c75c9b97db5b0d3ac7172346d27ce1b5d2f7"}
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.135711 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc"
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.160923 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.161122 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" containerName="nova-api-log" containerID="cri-o://8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f" gracePeriod=30
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.161558 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" containerName="nova-api-api" containerID="cri-o://e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9" gracePeriod=30
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.167873 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc" podStartSLOduration=3.16785803 podStartE2EDuration="3.16785803s" podCreationTimestamp="2026-03-09 03:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:37.15197131 +0000 UTC m=+1401.741635042" watchObservedRunningTime="2026-03-09 03:04:37.16785803 +0000 UTC m=+1401.757521762"
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.481784 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.594357 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-log-httpd\") pod \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") "
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.594527 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm5xj\" (UniqueName: \"kubernetes.io/projected/af8a02ba-7609-485f-86c7-09a4d9ba2e59-kube-api-access-xm5xj\") pod \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") "
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.594585 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-ceilometer-tls-certs\") pod \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") "
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.594639 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-sg-core-conf-yaml\") pod \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") "
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.594721 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-config-data\") pod \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") "
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.594756 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-run-httpd\") pod \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") "
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.594810 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-combined-ca-bundle\") pod \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") "
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.594865 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-scripts\") pod \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\" (UID: \"af8a02ba-7609-485f-86c7-09a4d9ba2e59\") "
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.596836 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "af8a02ba-7609-485f-86c7-09a4d9ba2e59" (UID: "af8a02ba-7609-485f-86c7-09a4d9ba2e59"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.597114 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "af8a02ba-7609-485f-86c7-09a4d9ba2e59" (UID: "af8a02ba-7609-485f-86c7-09a4d9ba2e59"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.602720 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-scripts" (OuterVolumeSpecName: "scripts") pod "af8a02ba-7609-485f-86c7-09a4d9ba2e59" (UID: "af8a02ba-7609-485f-86c7-09a4d9ba2e59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.604114 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8a02ba-7609-485f-86c7-09a4d9ba2e59-kube-api-access-xm5xj" (OuterVolumeSpecName: "kube-api-access-xm5xj") pod "af8a02ba-7609-485f-86c7-09a4d9ba2e59" (UID: "af8a02ba-7609-485f-86c7-09a4d9ba2e59"). InnerVolumeSpecName "kube-api-access-xm5xj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.627930 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "af8a02ba-7609-485f-86c7-09a4d9ba2e59" (UID: "af8a02ba-7609-485f-86c7-09a4d9ba2e59"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.675418 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "af8a02ba-7609-485f-86c7-09a4d9ba2e59" (UID: "af8a02ba-7609-485f-86c7-09a4d9ba2e59"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.698678 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm5xj\" (UniqueName: \"kubernetes.io/projected/af8a02ba-7609-485f-86c7-09a4d9ba2e59-kube-api-access-xm5xj\") on node \"crc\" DevicePath \"\""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.698726 4901 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.698745 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.698764 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.698781 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.698799 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8a02ba-7609-485f-86c7-09a4d9ba2e59-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.700467 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-config-data" (OuterVolumeSpecName: "config-data") pod "af8a02ba-7609-485f-86c7-09a4d9ba2e59" (UID: "af8a02ba-7609-485f-86c7-09a4d9ba2e59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.714133 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af8a02ba-7609-485f-86c7-09a4d9ba2e59" (UID: "af8a02ba-7609-485f-86c7-09a4d9ba2e59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.729343 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.801418 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 03:04:37 crc kubenswrapper[4901]: I0309 03:04:37.801480 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8a02ba-7609-485f-86c7-09a4d9ba2e59-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.145601 4901 generic.go:334] "Generic (PLEG): container finished" podID="258f621f-8909-4d36-8f2f-bdd166e47139" containerID="8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f" exitCode=143
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.145671 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"258f621f-8909-4d36-8f2f-bdd166e47139","Type":"ContainerDied","Data":"8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f"}
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.149013 4901 generic.go:334] "Generic (PLEG): container finished" podID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerID="9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4" exitCode=0
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.149405 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8a02ba-7609-485f-86c7-09a4d9ba2e59","Type":"ContainerDied","Data":"9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4"}
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.149460 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8a02ba-7609-485f-86c7-09a4d9ba2e59","Type":"ContainerDied","Data":"6c5334d2f7cd065a52ffe5ff3d49e355f6911d174012743979061301d813441b"}
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.149479 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.149485 4901 scope.go:117] "RemoveContainer" containerID="0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.182134 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.200792 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.204569 4901 scope.go:117] "RemoveContainer" containerID="3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.222248 4901 scope.go:117] "RemoveContainer" containerID="9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.236439 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 09 03:04:38 crc kubenswrapper[4901]: E0309 03:04:38.237345 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="proxy-httpd"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.237368 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="proxy-httpd"
Mar 09 03:04:38 crc kubenswrapper[4901]: E0309 03:04:38.237387 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="sg-core"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.237395 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="sg-core"
Mar 09 03:04:38 crc kubenswrapper[4901]: E0309 03:04:38.237410 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="ceilometer-central-agent"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.237417 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="ceilometer-central-agent"
Mar 09 03:04:38 crc kubenswrapper[4901]: E0309 03:04:38.237442 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="ceilometer-notification-agent"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.237450 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="ceilometer-notification-agent"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.237697 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="ceilometer-central-agent"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.237716 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="proxy-httpd"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.237740 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="ceilometer-notification-agent"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.237755 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" containerName="sg-core"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.239737 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.242743 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.242855 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.242870 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.245608 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.252024 4901 scope.go:117] "RemoveContainer" containerID="15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.271640 4901 scope.go:117] "RemoveContainer" containerID="0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221"
Mar 09 03:04:38 crc kubenswrapper[4901]: E0309 03:04:38.272203 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221\": container with ID starting with 0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221 not found: ID does not exist" containerID="0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221"
Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.272287 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221"} err="failed to get container status \"0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221\": rpc error: code = NotFound desc = could not find container \"0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221\":
container with ID starting with 0ac491278f2c49774f34e7e865692cadf5e0f002291ecd4bfd4afcaaa09e9221 not found: ID does not exist" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.272325 4901 scope.go:117] "RemoveContainer" containerID="3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1" Mar 09 03:04:38 crc kubenswrapper[4901]: E0309 03:04:38.272843 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1\": container with ID starting with 3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1 not found: ID does not exist" containerID="3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.272884 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1"} err="failed to get container status \"3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1\": rpc error: code = NotFound desc = could not find container \"3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1\": container with ID starting with 3a58648443d73d61e25350c6fccf6cfa99f350e533540cc5929f0298a5bdcef1 not found: ID does not exist" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.272918 4901 scope.go:117] "RemoveContainer" containerID="9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4" Mar 09 03:04:38 crc kubenswrapper[4901]: E0309 03:04:38.274249 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4\": container with ID starting with 9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4 not found: ID does not exist" 
containerID="9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.274285 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4"} err="failed to get container status \"9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4\": rpc error: code = NotFound desc = could not find container \"9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4\": container with ID starting with 9b21723bc3fd4a42353b511c49a6688c2d0c7f04ba36d9ccd7c5488fb59d3de4 not found: ID does not exist" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.274308 4901 scope.go:117] "RemoveContainer" containerID="15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db" Mar 09 03:04:38 crc kubenswrapper[4901]: E0309 03:04:38.274943 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db\": container with ID starting with 15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db not found: ID does not exist" containerID="15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.274989 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db"} err="failed to get container status \"15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db\": rpc error: code = NotFound desc = could not find container \"15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db\": container with ID starting with 15ea0bf3cdd9841411ad8bfe69bfb6d1bfa47555f7306fb86d7a6d9be2ab54db not found: ID does not exist" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.369853 4901 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:38 crc kubenswrapper[4901]: E0309 03:04:38.370544 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-w66p2 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="71690095-3031-4232-a10e-52b16c51590b" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.411711 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-scripts\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.411758 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-log-httpd\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.411801 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-run-httpd\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.411930 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 
03:04:38.412195 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.412308 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-config-data\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.412527 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.412719 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w66p2\" (UniqueName: \"kubernetes.io/projected/71690095-3031-4232-a10e-52b16c51590b-kube-api-access-w66p2\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.514615 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.514658 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-config-data\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.514715 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.514767 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w66p2\" (UniqueName: \"kubernetes.io/projected/71690095-3031-4232-a10e-52b16c51590b-kube-api-access-w66p2\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.514789 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-scripts\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.514806 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-log-httpd\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.514839 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-run-httpd\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: 
I0309 03:04:38.514860 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.517320 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-log-httpd\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.517668 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-run-httpd\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.520710 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.521811 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.521831 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.523457 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-config-data\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.526135 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-scripts\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:38 crc kubenswrapper[4901]: I0309 03:04:38.533645 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w66p2\" (UniqueName: \"kubernetes.io/projected/71690095-3031-4232-a10e-52b16c51590b-kube-api-access-w66p2\") pod \"ceilometer-0\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " pod="openstack/ceilometer-0" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.170380 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.195686 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.331213 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-scripts\") pod \"71690095-3031-4232-a10e-52b16c51590b\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.331391 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-log-httpd\") pod \"71690095-3031-4232-a10e-52b16c51590b\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.331432 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-config-data\") pod \"71690095-3031-4232-a10e-52b16c51590b\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.331452 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-sg-core-conf-yaml\") pod \"71690095-3031-4232-a10e-52b16c51590b\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.331485 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-ceilometer-tls-certs\") pod \"71690095-3031-4232-a10e-52b16c51590b\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.331529 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-combined-ca-bundle\") pod \"71690095-3031-4232-a10e-52b16c51590b\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.331557 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-run-httpd\") pod \"71690095-3031-4232-a10e-52b16c51590b\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.331590 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w66p2\" (UniqueName: \"kubernetes.io/projected/71690095-3031-4232-a10e-52b16c51590b-kube-api-access-w66p2\") pod \"71690095-3031-4232-a10e-52b16c51590b\" (UID: \"71690095-3031-4232-a10e-52b16c51590b\") " Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.331876 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "71690095-3031-4232-a10e-52b16c51590b" (UID: "71690095-3031-4232-a10e-52b16c51590b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.332172 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "71690095-3031-4232-a10e-52b16c51590b" (UID: "71690095-3031-4232-a10e-52b16c51590b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.332190 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.336792 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71690095-3031-4232-a10e-52b16c51590b" (UID: "71690095-3031-4232-a10e-52b16c51590b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.337844 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "71690095-3031-4232-a10e-52b16c51590b" (UID: "71690095-3031-4232-a10e-52b16c51590b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.338191 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "71690095-3031-4232-a10e-52b16c51590b" (UID: "71690095-3031-4232-a10e-52b16c51590b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.339096 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71690095-3031-4232-a10e-52b16c51590b-kube-api-access-w66p2" (OuterVolumeSpecName: "kube-api-access-w66p2") pod "71690095-3031-4232-a10e-52b16c51590b" (UID: "71690095-3031-4232-a10e-52b16c51590b"). InnerVolumeSpecName "kube-api-access-w66p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.350695 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-config-data" (OuterVolumeSpecName: "config-data") pod "71690095-3031-4232-a10e-52b16c51590b" (UID: "71690095-3031-4232-a10e-52b16c51590b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.357886 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-scripts" (OuterVolumeSpecName: "scripts") pod "71690095-3031-4232-a10e-52b16c51590b" (UID: "71690095-3031-4232-a10e-52b16c51590b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.433528 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.433563 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71690095-3031-4232-a10e-52b16c51590b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.433575 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w66p2\" (UniqueName: \"kubernetes.io/projected/71690095-3031-4232-a10e-52b16c51590b-kube-api-access-w66p2\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.433588 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.433601 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.433624 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:39 crc kubenswrapper[4901]: I0309 03:04:39.433634 4901 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71690095-3031-4232-a10e-52b16c51590b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.122667 4901 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8a02ba-7609-485f-86c7-09a4d9ba2e59" path="/var/lib/kubelet/pods/af8a02ba-7609-485f-86c7-09a4d9ba2e59/volumes" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.182419 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.285152 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.305345 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.327727 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.341654 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.347332 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.348955 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.349383 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.349696 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.451571 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 
03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.451691 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-log-httpd\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.451754 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-scripts\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.451842 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-run-httpd\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.451893 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.451925 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5pq\" (UniqueName: \"kubernetes.io/projected/2c3d4e9a-122e-4894-98b2-91784a9f44e8-kube-api-access-4p5pq\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.452000 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-config-data\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.452040 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.554027 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-log-httpd\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.554117 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-scripts\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.554199 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-run-httpd\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.554273 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.554307 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5pq\" (UniqueName: \"kubernetes.io/projected/2c3d4e9a-122e-4894-98b2-91784a9f44e8-kube-api-access-4p5pq\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.554378 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-config-data\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.554404 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.554470 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.561722 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-run-httpd\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.562090 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-log-httpd\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.565939 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.569112 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.570823 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.574395 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-config-data\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.576041 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-scripts\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 
03:04:40.588978 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5pq\" (UniqueName: \"kubernetes.io/projected/2c3d4e9a-122e-4894-98b2-91784a9f44e8-kube-api-access-4p5pq\") pod \"ceilometer-0\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.732505 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.772829 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.859707 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/258f621f-8909-4d36-8f2f-bdd166e47139-logs\") pod \"258f621f-8909-4d36-8f2f-bdd166e47139\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.859969 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-combined-ca-bundle\") pod \"258f621f-8909-4d36-8f2f-bdd166e47139\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.860007 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgwm4\" (UniqueName: \"kubernetes.io/projected/258f621f-8909-4d36-8f2f-bdd166e47139-kube-api-access-qgwm4\") pod \"258f621f-8909-4d36-8f2f-bdd166e47139\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.860044 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-config-data\") pod 
\"258f621f-8909-4d36-8f2f-bdd166e47139\" (UID: \"258f621f-8909-4d36-8f2f-bdd166e47139\") " Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.860421 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258f621f-8909-4d36-8f2f-bdd166e47139-logs" (OuterVolumeSpecName: "logs") pod "258f621f-8909-4d36-8f2f-bdd166e47139" (UID: "258f621f-8909-4d36-8f2f-bdd166e47139"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.871382 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258f621f-8909-4d36-8f2f-bdd166e47139-kube-api-access-qgwm4" (OuterVolumeSpecName: "kube-api-access-qgwm4") pod "258f621f-8909-4d36-8f2f-bdd166e47139" (UID: "258f621f-8909-4d36-8f2f-bdd166e47139"). InnerVolumeSpecName "kube-api-access-qgwm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.894429 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "258f621f-8909-4d36-8f2f-bdd166e47139" (UID: "258f621f-8909-4d36-8f2f-bdd166e47139"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.901832 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-config-data" (OuterVolumeSpecName: "config-data") pod "258f621f-8909-4d36-8f2f-bdd166e47139" (UID: "258f621f-8909-4d36-8f2f-bdd166e47139"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.962826 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.963099 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgwm4\" (UniqueName: \"kubernetes.io/projected/258f621f-8909-4d36-8f2f-bdd166e47139-kube-api-access-qgwm4\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.963113 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258f621f-8909-4d36-8f2f-bdd166e47139-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:40 crc kubenswrapper[4901]: I0309 03:04:40.963126 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/258f621f-8909-4d36-8f2f-bdd166e47139-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.193784 4901 generic.go:334] "Generic (PLEG): container finished" podID="258f621f-8909-4d36-8f2f-bdd166e47139" containerID="e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9" exitCode=0 Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.193828 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"258f621f-8909-4d36-8f2f-bdd166e47139","Type":"ContainerDied","Data":"e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9"} Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.193857 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.193877 4901 scope.go:117] "RemoveContainer" containerID="e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.193863 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"258f621f-8909-4d36-8f2f-bdd166e47139","Type":"ContainerDied","Data":"e476aa66077d85f60a3bdd6349d821c47394cb422ca4e82a6e76537fa21c7c73"} Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.216570 4901 scope.go:117] "RemoveContainer" containerID="8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.249662 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.256444 4901 scope.go:117] "RemoveContainer" containerID="e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9" Mar 09 03:04:41 crc kubenswrapper[4901]: E0309 03:04:41.260491 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9\": container with ID starting with e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9 not found: ID does not exist" containerID="e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.260528 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9"} err="failed to get container status \"e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9\": rpc error: code = NotFound desc = could not find container \"e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9\": container with ID starting with 
e37a6eec70c902f78746a9a22ee650d4599cd6a648db9148591ccbd0155126e9 not found: ID does not exist" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.260549 4901 scope.go:117] "RemoveContainer" containerID="8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f" Mar 09 03:04:41 crc kubenswrapper[4901]: E0309 03:04:41.263036 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f\": container with ID starting with 8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f not found: ID does not exist" containerID="8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.263062 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f"} err="failed to get container status \"8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f\": rpc error: code = NotFound desc = could not find container \"8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f\": container with ID starting with 8330b5ad7d717048a6505dd4d7ea09f3041f917dbb4d0755882ea880e972135f not found: ID does not exist" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.276471 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.284008 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:41 crc kubenswrapper[4901]: E0309 03:04:41.284488 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" containerName="nova-api-log" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.284512 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" 
containerName="nova-api-log" Mar 09 03:04:41 crc kubenswrapper[4901]: E0309 03:04:41.284537 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" containerName="nova-api-api" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.284544 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" containerName="nova-api-api" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.284739 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" containerName="nova-api-api" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.284791 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" containerName="nova-api-log" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.285834 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.288933 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.289655 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.289796 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.296899 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.304714 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.372477 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-config-data\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.372525 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af4527fd-88dd-47e5-bde9-7813caf84a2d-logs\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.372787 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.372826 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.372864 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.373131 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njwhb\" (UniqueName: \"kubernetes.io/projected/af4527fd-88dd-47e5-bde9-7813caf84a2d-kube-api-access-njwhb\") pod \"nova-api-0\" (UID: 
\"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.475190 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njwhb\" (UniqueName: \"kubernetes.io/projected/af4527fd-88dd-47e5-bde9-7813caf84a2d-kube-api-access-njwhb\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.475312 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-config-data\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.475343 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af4527fd-88dd-47e5-bde9-7813caf84a2d-logs\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.475474 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.475503 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.475539 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.476646 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af4527fd-88dd-47e5-bde9-7813caf84a2d-logs\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.480752 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.481511 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.482292 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-config-data\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.483746 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.497797 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-njwhb\" (UniqueName: \"kubernetes.io/projected/af4527fd-88dd-47e5-bde9-7813caf84a2d-kube-api-access-njwhb\") pod \"nova-api-0\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " pod="openstack/nova-api-0" Mar 09 03:04:41 crc kubenswrapper[4901]: I0309 03:04:41.605533 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:42 crc kubenswrapper[4901]: I0309 03:04:42.050458 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:42 crc kubenswrapper[4901]: I0309 03:04:42.119448 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258f621f-8909-4d36-8f2f-bdd166e47139" path="/var/lib/kubelet/pods/258f621f-8909-4d36-8f2f-bdd166e47139/volumes" Mar 09 03:04:42 crc kubenswrapper[4901]: I0309 03:04:42.120602 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71690095-3031-4232-a10e-52b16c51590b" path="/var/lib/kubelet/pods/71690095-3031-4232-a10e-52b16c51590b/volumes" Mar 09 03:04:42 crc kubenswrapper[4901]: I0309 03:04:42.212088 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c3d4e9a-122e-4894-98b2-91784a9f44e8","Type":"ContainerStarted","Data":"fd30c7afd1e4a7b025b8b590cdbf65ff2deb03badcd9cc3419bf24e1b542c1ed"} Mar 09 03:04:42 crc kubenswrapper[4901]: I0309 03:04:42.212399 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c3d4e9a-122e-4894-98b2-91784a9f44e8","Type":"ContainerStarted","Data":"75f36c13b7aae4668a2ff9369eb56a9b13df781875a0b09a75c7ec4d0762cf1a"} Mar 09 03:04:42 crc kubenswrapper[4901]: I0309 03:04:42.215611 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af4527fd-88dd-47e5-bde9-7813caf84a2d","Type":"ContainerStarted","Data":"b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e"} Mar 09 03:04:42 crc 
kubenswrapper[4901]: I0309 03:04:42.215684 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af4527fd-88dd-47e5-bde9-7813caf84a2d","Type":"ContainerStarted","Data":"684c98fd1b7c16fe0438eb7b7d630b50ce4e47166be500691236b3a06fe4c967"} Mar 09 03:04:42 crc kubenswrapper[4901]: I0309 03:04:42.729443 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:04:42 crc kubenswrapper[4901]: I0309 03:04:42.750193 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.229778 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c3d4e9a-122e-4894-98b2-91784a9f44e8","Type":"ContainerStarted","Data":"ef72ec9152a71c9764e40c053af5316561da42488861f73e1bfb72f8538b1bb8"} Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.230271 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c3d4e9a-122e-4894-98b2-91784a9f44e8","Type":"ContainerStarted","Data":"5728e2e081eb4563b67c226046888a882e8bdb83914ea86f35f1527e56c5d36a"} Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.232974 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af4527fd-88dd-47e5-bde9-7813caf84a2d","Type":"ContainerStarted","Data":"70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5"} Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.262329 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.262301146 podStartE2EDuration="2.262301146s" podCreationTimestamp="2026-03-09 03:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:43.247993745 +0000 UTC m=+1407.837657547" 
watchObservedRunningTime="2026-03-09 03:04:43.262301146 +0000 UTC m=+1407.851964888" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.265654 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.474827 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-bxxsr"] Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.476848 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.482818 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.483533 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.492452 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bxxsr"] Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.617694 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-scripts\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.618049 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6cc6\" (UniqueName: \"kubernetes.io/projected/a004f541-f809-42ea-89ac-13bea0f45829-kube-api-access-w6cc6\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.618313 
4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-config-data\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.618452 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.720632 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-scripts\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.721731 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cc6\" (UniqueName: \"kubernetes.io/projected/a004f541-f809-42ea-89ac-13bea0f45829-kube-api-access-w6cc6\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.721958 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-config-data\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.722173 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.727869 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.733942 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-config-data\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.735794 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-scripts\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.741191 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6cc6\" (UniqueName: \"kubernetes.io/projected/a004f541-f809-42ea-89ac-13bea0f45829-kube-api-access-w6cc6\") pod \"nova-cell1-cell-mapping-bxxsr\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:43 crc kubenswrapper[4901]: I0309 03:04:43.809156 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:44 crc kubenswrapper[4901]: I0309 03:04:44.304344 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bxxsr"] Mar 09 03:04:44 crc kubenswrapper[4901]: W0309 03:04:44.308703 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda004f541_f809_42ea_89ac_13bea0f45829.slice/crio-462cb3fdd5acd84b4a07fc50e9298655d8e2ef05c0d93aaf754e3539e2e98ac1 WatchSource:0}: Error finding container 462cb3fdd5acd84b4a07fc50e9298655d8e2ef05c0d93aaf754e3539e2e98ac1: Status 404 returned error can't find the container with id 462cb3fdd5acd84b4a07fc50e9298655d8e2ef05c0d93aaf754e3539e2e98ac1 Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.052406 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.138849 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7559df67df-7q7jp"] Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.139265 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" podUID="f0c1248f-7953-4a19-a4ff-f7b717bd699b" containerName="dnsmasq-dns" containerID="cri-o://da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a" gracePeriod=10 Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.256923 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c3d4e9a-122e-4894-98b2-91784a9f44e8","Type":"ContainerStarted","Data":"2cd524a1798da2be5336faeee44da3b0c7ebcf43441ba6883148aca3906e53c0"} Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.257056 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 
03:04:45.259493 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bxxsr" event={"ID":"a004f541-f809-42ea-89ac-13bea0f45829","Type":"ContainerStarted","Data":"fe08a232bade973666189357aa00dd8de0649590396e21fa9caeeccc1270a3ef"} Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.259552 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bxxsr" event={"ID":"a004f541-f809-42ea-89ac-13bea0f45829","Type":"ContainerStarted","Data":"462cb3fdd5acd84b4a07fc50e9298655d8e2ef05c0d93aaf754e3539e2e98ac1"} Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.298197 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8061925589999999 podStartE2EDuration="5.298176814s" podCreationTimestamp="2026-03-09 03:04:40 +0000 UTC" firstStartedPulling="2026-03-09 03:04:41.286709087 +0000 UTC m=+1405.876372809" lastFinishedPulling="2026-03-09 03:04:44.778693312 +0000 UTC m=+1409.368357064" observedRunningTime="2026-03-09 03:04:45.288514941 +0000 UTC m=+1409.878178673" watchObservedRunningTime="2026-03-09 03:04:45.298176814 +0000 UTC m=+1409.887840546" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.323946 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-bxxsr" podStartSLOduration=2.323928244 podStartE2EDuration="2.323928244s" podCreationTimestamp="2026-03-09 03:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:45.302880843 +0000 UTC m=+1409.892544575" watchObservedRunningTime="2026-03-09 03:04:45.323928244 +0000 UTC m=+1409.913591976" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.623093 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.781798 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-swift-storage-0\") pod \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.782062 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6l2g\" (UniqueName: \"kubernetes.io/projected/f0c1248f-7953-4a19-a4ff-f7b717bd699b-kube-api-access-h6l2g\") pod \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.782262 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-sb\") pod \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.782533 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-svc\") pod \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.782613 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-config\") pod \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.782814 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-nb\") pod \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\" (UID: \"f0c1248f-7953-4a19-a4ff-f7b717bd699b\") " Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.792419 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0c1248f-7953-4a19-a4ff-f7b717bd699b-kube-api-access-h6l2g" (OuterVolumeSpecName: "kube-api-access-h6l2g") pod "f0c1248f-7953-4a19-a4ff-f7b717bd699b" (UID: "f0c1248f-7953-4a19-a4ff-f7b717bd699b"). InnerVolumeSpecName "kube-api-access-h6l2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.825611 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0c1248f-7953-4a19-a4ff-f7b717bd699b" (UID: "f0c1248f-7953-4a19-a4ff-f7b717bd699b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.833439 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-config" (OuterVolumeSpecName: "config") pod "f0c1248f-7953-4a19-a4ff-f7b717bd699b" (UID: "f0c1248f-7953-4a19-a4ff-f7b717bd699b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.836248 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f0c1248f-7953-4a19-a4ff-f7b717bd699b" (UID: "f0c1248f-7953-4a19-a4ff-f7b717bd699b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.836794 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f0c1248f-7953-4a19-a4ff-f7b717bd699b" (UID: "f0c1248f-7953-4a19-a4ff-f7b717bd699b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.864143 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f0c1248f-7953-4a19-a4ff-f7b717bd699b" (UID: "f0c1248f-7953-4a19-a4ff-f7b717bd699b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.886337 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.886389 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6l2g\" (UniqueName: \"kubernetes.io/projected/f0c1248f-7953-4a19-a4ff-f7b717bd699b-kube-api-access-h6l2g\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.886402 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.886412 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.886420 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:45 crc kubenswrapper[4901]: I0309 03:04:45.886428 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0c1248f-7953-4a19-a4ff-f7b717bd699b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.272255 4901 generic.go:334] "Generic (PLEG): container finished" podID="f0c1248f-7953-4a19-a4ff-f7b717bd699b" containerID="da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a" exitCode=0 Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.272342 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" event={"ID":"f0c1248f-7953-4a19-a4ff-f7b717bd699b","Type":"ContainerDied","Data":"da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a"} Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.272362 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.272394 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7559df67df-7q7jp" event={"ID":"f0c1248f-7953-4a19-a4ff-f7b717bd699b","Type":"ContainerDied","Data":"8bc7e7bcfa9ea0b7f46346767b5526b50417a3194a64db5883b1b935e93b9a82"} Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.272416 4901 scope.go:117] "RemoveContainer" containerID="da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a" Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.300919 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7559df67df-7q7jp"] Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.304845 4901 scope.go:117] "RemoveContainer" containerID="07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff" Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.310854 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7559df67df-7q7jp"] Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.335320 4901 scope.go:117] "RemoveContainer" containerID="da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a" Mar 09 03:04:46 crc kubenswrapper[4901]: E0309 03:04:46.335898 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a\": container with ID starting with da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a not found: ID does not exist" containerID="da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a" Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.335951 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a"} err="failed to get container status 
\"da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a\": rpc error: code = NotFound desc = could not find container \"da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a\": container with ID starting with da3dd0236d7e00ec5f51e67a138e5431887dd87e41d62b3f23daf194a910085a not found: ID does not exist" Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.335977 4901 scope.go:117] "RemoveContainer" containerID="07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff" Mar 09 03:04:46 crc kubenswrapper[4901]: E0309 03:04:46.336443 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff\": container with ID starting with 07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff not found: ID does not exist" containerID="07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff" Mar 09 03:04:46 crc kubenswrapper[4901]: I0309 03:04:46.336465 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff"} err="failed to get container status \"07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff\": rpc error: code = NotFound desc = could not find container \"07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff\": container with ID starting with 07bf31610f8975ae92a0e27b9fcd014493e2f2ff22c12620f19d45ee99431eff not found: ID does not exist" Mar 09 03:04:48 crc kubenswrapper[4901]: I0309 03:04:48.128117 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0c1248f-7953-4a19-a4ff-f7b717bd699b" path="/var/lib/kubelet/pods/f0c1248f-7953-4a19-a4ff-f7b717bd699b/volumes" Mar 09 03:04:49 crc kubenswrapper[4901]: I0309 03:04:49.308108 4901 generic.go:334] "Generic (PLEG): container finished" podID="a004f541-f809-42ea-89ac-13bea0f45829" 
containerID="fe08a232bade973666189357aa00dd8de0649590396e21fa9caeeccc1270a3ef" exitCode=0 Mar 09 03:04:49 crc kubenswrapper[4901]: I0309 03:04:49.308199 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bxxsr" event={"ID":"a004f541-f809-42ea-89ac-13bea0f45829","Type":"ContainerDied","Data":"fe08a232bade973666189357aa00dd8de0649590396e21fa9caeeccc1270a3ef"} Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.757261 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.895473 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6cc6\" (UniqueName: \"kubernetes.io/projected/a004f541-f809-42ea-89ac-13bea0f45829-kube-api-access-w6cc6\") pod \"a004f541-f809-42ea-89ac-13bea0f45829\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.895712 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-combined-ca-bundle\") pod \"a004f541-f809-42ea-89ac-13bea0f45829\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.895800 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-config-data\") pod \"a004f541-f809-42ea-89ac-13bea0f45829\" (UID: \"a004f541-f809-42ea-89ac-13bea0f45829\") " Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.895901 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-scripts\") pod \"a004f541-f809-42ea-89ac-13bea0f45829\" (UID: 
\"a004f541-f809-42ea-89ac-13bea0f45829\") " Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.900938 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-scripts" (OuterVolumeSpecName: "scripts") pod "a004f541-f809-42ea-89ac-13bea0f45829" (UID: "a004f541-f809-42ea-89ac-13bea0f45829"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.904104 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a004f541-f809-42ea-89ac-13bea0f45829-kube-api-access-w6cc6" (OuterVolumeSpecName: "kube-api-access-w6cc6") pod "a004f541-f809-42ea-89ac-13bea0f45829" (UID: "a004f541-f809-42ea-89ac-13bea0f45829"). InnerVolumeSpecName "kube-api-access-w6cc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.926370 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-config-data" (OuterVolumeSpecName: "config-data") pod "a004f541-f809-42ea-89ac-13bea0f45829" (UID: "a004f541-f809-42ea-89ac-13bea0f45829"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.926647 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a004f541-f809-42ea-89ac-13bea0f45829" (UID: "a004f541-f809-42ea-89ac-13bea0f45829"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.998452 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6cc6\" (UniqueName: \"kubernetes.io/projected/a004f541-f809-42ea-89ac-13bea0f45829-kube-api-access-w6cc6\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.998491 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.998500 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:50 crc kubenswrapper[4901]: I0309 03:04:50.998511 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a004f541-f809-42ea-89ac-13bea0f45829-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.335345 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bxxsr" event={"ID":"a004f541-f809-42ea-89ac-13bea0f45829","Type":"ContainerDied","Data":"462cb3fdd5acd84b4a07fc50e9298655d8e2ef05c0d93aaf754e3539e2e98ac1"} Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.336178 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="462cb3fdd5acd84b4a07fc50e9298655d8e2ef05c0d93aaf754e3539e2e98ac1" Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.335417 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bxxsr" Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.557586 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.559187 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="af4527fd-88dd-47e5-bde9-7813caf84a2d" containerName="nova-api-log" containerID="cri-o://b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e" gracePeriod=30 Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.560035 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="af4527fd-88dd-47e5-bde9-7813caf84a2d" containerName="nova-api-api" containerID="cri-o://70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5" gracePeriod=30 Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.577626 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.577873 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8653c2ab-c097-435e-b694-5c894b6bdd11" containerName="nova-scheduler-scheduler" containerID="cri-o://5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602" gracePeriod=30 Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.655009 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.655273 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-log" containerID="cri-o://dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625" gracePeriod=30 Mar 09 03:04:51 crc kubenswrapper[4901]: I0309 03:04:51.656012 4901 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-metadata" containerID="cri-o://84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346" gracePeriod=30 Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.098923 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.228607 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njwhb\" (UniqueName: \"kubernetes.io/projected/af4527fd-88dd-47e5-bde9-7813caf84a2d-kube-api-access-njwhb\") pod \"af4527fd-88dd-47e5-bde9-7813caf84a2d\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.229723 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-config-data\") pod \"af4527fd-88dd-47e5-bde9-7813caf84a2d\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.230048 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-public-tls-certs\") pod \"af4527fd-88dd-47e5-bde9-7813caf84a2d\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.230125 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-combined-ca-bundle\") pod \"af4527fd-88dd-47e5-bde9-7813caf84a2d\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.230210 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-internal-tls-certs\") pod \"af4527fd-88dd-47e5-bde9-7813caf84a2d\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.230342 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af4527fd-88dd-47e5-bde9-7813caf84a2d-logs\") pod \"af4527fd-88dd-47e5-bde9-7813caf84a2d\" (UID: \"af4527fd-88dd-47e5-bde9-7813caf84a2d\") " Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.232211 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af4527fd-88dd-47e5-bde9-7813caf84a2d-logs" (OuterVolumeSpecName: "logs") pod "af4527fd-88dd-47e5-bde9-7813caf84a2d" (UID: "af4527fd-88dd-47e5-bde9-7813caf84a2d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.260090 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af4527fd-88dd-47e5-bde9-7813caf84a2d-kube-api-access-njwhb" (OuterVolumeSpecName: "kube-api-access-njwhb") pod "af4527fd-88dd-47e5-bde9-7813caf84a2d" (UID: "af4527fd-88dd-47e5-bde9-7813caf84a2d"). InnerVolumeSpecName "kube-api-access-njwhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.334566 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af4527fd-88dd-47e5-bde9-7813caf84a2d-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.334610 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njwhb\" (UniqueName: \"kubernetes.io/projected/af4527fd-88dd-47e5-bde9-7813caf84a2d-kube-api-access-njwhb\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.353637 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-config-data" (OuterVolumeSpecName: "config-data") pod "af4527fd-88dd-47e5-bde9-7813caf84a2d" (UID: "af4527fd-88dd-47e5-bde9-7813caf84a2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.366517 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af4527fd-88dd-47e5-bde9-7813caf84a2d" (UID: "af4527fd-88dd-47e5-bde9-7813caf84a2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.385587 4901 generic.go:334] "Generic (PLEG): container finished" podID="af4527fd-88dd-47e5-bde9-7813caf84a2d" containerID="70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5" exitCode=0 Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.385628 4901 generic.go:334] "Generic (PLEG): container finished" podID="af4527fd-88dd-47e5-bde9-7813caf84a2d" containerID="b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e" exitCode=143 Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.385668 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af4527fd-88dd-47e5-bde9-7813caf84a2d","Type":"ContainerDied","Data":"70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5"} Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.385695 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af4527fd-88dd-47e5-bde9-7813caf84a2d","Type":"ContainerDied","Data":"b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e"} Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.385703 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af4527fd-88dd-47e5-bde9-7813caf84a2d","Type":"ContainerDied","Data":"684c98fd1b7c16fe0438eb7b7d630b50ce4e47166be500691236b3a06fe4c967"} Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.385717 4901 scope.go:117] "RemoveContainer" containerID="70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.385818 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.397437 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "af4527fd-88dd-47e5-bde9-7813caf84a2d" (UID: "af4527fd-88dd-47e5-bde9-7813caf84a2d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.399948 4901 generic.go:334] "Generic (PLEG): container finished" podID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerID="dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625" exitCode=143 Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.399988 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c7fbb65-658d-416a-85da-243a966b9bc9","Type":"ContainerDied","Data":"dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625"} Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.417520 4901 scope.go:117] "RemoveContainer" containerID="b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.419647 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "af4527fd-88dd-47e5-bde9-7813caf84a2d" (UID: "af4527fd-88dd-47e5-bde9-7813caf84a2d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.434576 4901 scope.go:117] "RemoveContainer" containerID="70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.435711 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.435739 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.435749 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.435758 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4527fd-88dd-47e5-bde9-7813caf84a2d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:52 crc kubenswrapper[4901]: E0309 03:04:52.436075 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5\": container with ID starting with 70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5 not found: ID does not exist" containerID="70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.436104 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5"} err="failed to get container status \"70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5\": rpc error: code = NotFound desc = could not find container \"70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5\": container with ID starting with 70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5 not found: ID does not exist" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.436123 4901 scope.go:117] "RemoveContainer" containerID="b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e" Mar 09 03:04:52 crc kubenswrapper[4901]: E0309 03:04:52.436338 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e\": container with ID starting with b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e not found: ID does not exist" containerID="b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.436370 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e"} err="failed to get container status \"b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e\": rpc error: code = NotFound desc = could not find container \"b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e\": container with ID starting with b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e not found: ID does not exist" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.436383 4901 scope.go:117] "RemoveContainer" containerID="70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.436571 4901 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5"} err="failed to get container status \"70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5\": rpc error: code = NotFound desc = could not find container \"70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5\": container with ID starting with 70b7dded54b5b180e678b520941248da5da9b41ee342a579a9f4522788f006b5 not found: ID does not exist" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.436602 4901 scope.go:117] "RemoveContainer" containerID="b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.436857 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e"} err="failed to get container status \"b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e\": rpc error: code = NotFound desc = could not find container \"b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e\": container with ID starting with b6125e05fa62fe3debb29acf01b57b9ab6c7001dd28b9627d1eb7f147ff9fa2e not found: ID does not exist" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.885833 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.902069 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.928350 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:52 crc kubenswrapper[4901]: E0309 03:04:52.928954 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c1248f-7953-4a19-a4ff-f7b717bd699b" containerName="dnsmasq-dns" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.928970 4901 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f0c1248f-7953-4a19-a4ff-f7b717bd699b" containerName="dnsmasq-dns" Mar 09 03:04:52 crc kubenswrapper[4901]: E0309 03:04:52.928988 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4527fd-88dd-47e5-bde9-7813caf84a2d" containerName="nova-api-api" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.929006 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4527fd-88dd-47e5-bde9-7813caf84a2d" containerName="nova-api-api" Mar 09 03:04:52 crc kubenswrapper[4901]: E0309 03:04:52.929021 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4527fd-88dd-47e5-bde9-7813caf84a2d" containerName="nova-api-log" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.929027 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4527fd-88dd-47e5-bde9-7813caf84a2d" containerName="nova-api-log" Mar 09 03:04:52 crc kubenswrapper[4901]: E0309 03:04:52.929054 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a004f541-f809-42ea-89ac-13bea0f45829" containerName="nova-manage" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.929060 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a004f541-f809-42ea-89ac-13bea0f45829" containerName="nova-manage" Mar 09 03:04:52 crc kubenswrapper[4901]: E0309 03:04:52.929071 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c1248f-7953-4a19-a4ff-f7b717bd699b" containerName="init" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.929077 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c1248f-7953-4a19-a4ff-f7b717bd699b" containerName="init" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.929275 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a004f541-f809-42ea-89ac-13bea0f45829" containerName="nova-manage" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.929285 4901 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f0c1248f-7953-4a19-a4ff-f7b717bd699b" containerName="dnsmasq-dns" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.929294 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="af4527fd-88dd-47e5-bde9-7813caf84a2d" containerName="nova-api-log" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.929307 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="af4527fd-88dd-47e5-bde9-7813caf84a2d" containerName="nova-api-api" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.930281 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.935967 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.935988 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.936400 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 03:04:52 crc kubenswrapper[4901]: I0309 03:04:52.940051 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.055744 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.055777 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-config-data\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " 
pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.055821 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.055839 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-logs\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.056141 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.056384 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tq6z\" (UniqueName: \"kubernetes.io/projected/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-kube-api-access-7tq6z\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.119332 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.158022 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.158143 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tq6z\" (UniqueName: \"kubernetes.io/projected/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-kube-api-access-7tq6z\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.158372 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.158427 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-config-data\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.158560 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.159291 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-logs\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.159963 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-logs\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.164618 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.166110 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.167345 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-config-data\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.169482 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.180153 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7tq6z\" (UniqueName: \"kubernetes.io/projected/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-kube-api-access-7tq6z\") pod \"nova-api-0\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.261080 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-config-data\") pod \"8653c2ab-c097-435e-b694-5c894b6bdd11\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.261142 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n9jd\" (UniqueName: \"kubernetes.io/projected/8653c2ab-c097-435e-b694-5c894b6bdd11-kube-api-access-9n9jd\") pod \"8653c2ab-c097-435e-b694-5c894b6bdd11\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.261806 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.261884 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-combined-ca-bundle\") pod \"8653c2ab-c097-435e-b694-5c894b6bdd11\" (UID: \"8653c2ab-c097-435e-b694-5c894b6bdd11\") " Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.263785 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8653c2ab-c097-435e-b694-5c894b6bdd11-kube-api-access-9n9jd" (OuterVolumeSpecName: "kube-api-access-9n9jd") pod "8653c2ab-c097-435e-b694-5c894b6bdd11" (UID: "8653c2ab-c097-435e-b694-5c894b6bdd11"). InnerVolumeSpecName "kube-api-access-9n9jd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.304465 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8653c2ab-c097-435e-b694-5c894b6bdd11" (UID: "8653c2ab-c097-435e-b694-5c894b6bdd11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.317099 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-config-data" (OuterVolumeSpecName: "config-data") pod "8653c2ab-c097-435e-b694-5c894b6bdd11" (UID: "8653c2ab-c097-435e-b694-5c894b6bdd11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.363934 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.363962 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n9jd\" (UniqueName: \"kubernetes.io/projected/8653c2ab-c097-435e-b694-5c894b6bdd11-kube-api-access-9n9jd\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.363971 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8653c2ab-c097-435e-b694-5c894b6bdd11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.415738 4901 generic.go:334] "Generic (PLEG): container finished" podID="8653c2ab-c097-435e-b694-5c894b6bdd11" containerID="5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602" 
exitCode=0 Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.415792 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8653c2ab-c097-435e-b694-5c894b6bdd11","Type":"ContainerDied","Data":"5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602"} Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.416080 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8653c2ab-c097-435e-b694-5c894b6bdd11","Type":"ContainerDied","Data":"43c1362482fc71ee4beee7cac5ce56c56f84ae35e91d7b3cef3569a3d2bcb581"} Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.416106 4901 scope.go:117] "RemoveContainer" containerID="5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.415832 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.469125 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.485170 4901 scope.go:117] "RemoveContainer" containerID="5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602" Mar 09 03:04:53 crc kubenswrapper[4901]: E0309 03:04:53.487866 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602\": container with ID starting with 5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602 not found: ID does not exist" containerID="5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.487901 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602"} 
err="failed to get container status \"5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602\": rpc error: code = NotFound desc = could not find container \"5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602\": container with ID starting with 5525bd22a7f2778ddd76385ce3c2eaf6dc537d127ed1059b1bcae84388c30602 not found: ID does not exist" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.489547 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.506759 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:53 crc kubenswrapper[4901]: E0309 03:04:53.507160 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8653c2ab-c097-435e-b694-5c894b6bdd11" containerName="nova-scheduler-scheduler" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.507173 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8653c2ab-c097-435e-b694-5c894b6bdd11" containerName="nova-scheduler-scheduler" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.507444 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8653c2ab-c097-435e-b694-5c894b6bdd11" containerName="nova-scheduler-scheduler" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.508059 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.511565 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.530494 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.567948 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.568021 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjt8j\" (UniqueName: \"kubernetes.io/projected/6b3d0806-00f0-46d7-a77f-f505583e49a2-kube-api-access-mjt8j\") pod \"nova-scheduler-0\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.568280 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-config-data\") pod \"nova-scheduler-0\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.669672 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.669740 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjt8j\" (UniqueName: \"kubernetes.io/projected/6b3d0806-00f0-46d7-a77f-f505583e49a2-kube-api-access-mjt8j\") pod \"nova-scheduler-0\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.669836 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-config-data\") pod \"nova-scheduler-0\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.674962 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-config-data\") pod \"nova-scheduler-0\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.680927 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.687302 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjt8j\" (UniqueName: \"kubernetes.io/projected/6b3d0806-00f0-46d7-a77f-f505583e49a2-kube-api-access-mjt8j\") pod \"nova-scheduler-0\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " pod="openstack/nova-scheduler-0" Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.811806 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:04:53 crc kubenswrapper[4901]: W0309 03:04:53.813742 4901 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab8ead2b_4e3f_4d04_b16c_7b2a08b5aa9d.slice/crio-dd9b08f42cb5c4cb3a7f77bfdc8c4d6f8d8459932ce797310f693a7e030be0cf WatchSource:0}: Error finding container dd9b08f42cb5c4cb3a7f77bfdc8c4d6f8d8459932ce797310f693a7e030be0cf: Status 404 returned error can't find the container with id dd9b08f42cb5c4cb3a7f77bfdc8c4d6f8d8459932ce797310f693a7e030be0cf Mar 09 03:04:53 crc kubenswrapper[4901]: I0309 03:04:53.832899 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:04:54 crc kubenswrapper[4901]: I0309 03:04:54.117087 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8653c2ab-c097-435e-b694-5c894b6bdd11" path="/var/lib/kubelet/pods/8653c2ab-c097-435e-b694-5c894b6bdd11/volumes" Mar 09 03:04:54 crc kubenswrapper[4901]: I0309 03:04:54.118576 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af4527fd-88dd-47e5-bde9-7813caf84a2d" path="/var/lib/kubelet/pods/af4527fd-88dd-47e5-bde9-7813caf84a2d/volumes" Mar 09 03:04:54 crc kubenswrapper[4901]: I0309 03:04:54.331233 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:04:54 crc kubenswrapper[4901]: W0309 03:04:54.342388 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b3d0806_00f0_46d7_a77f_f505583e49a2.slice/crio-905d7cd97d4c58eb8b586f62c5b5e78d65d3ffb136122fb1c755faa875c96459 WatchSource:0}: Error finding container 905d7cd97d4c58eb8b586f62c5b5e78d65d3ffb136122fb1c755faa875c96459: Status 404 returned error can't find the container with id 905d7cd97d4c58eb8b586f62c5b5e78d65d3ffb136122fb1c755faa875c96459 Mar 09 03:04:54 crc kubenswrapper[4901]: I0309 03:04:54.433293 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d","Type":"ContainerStarted","Data":"f97ba1fa97a190b5ca6b6e5bbe19c6161fb765c75b9fffa80f6b76be75c0c237"} Mar 09 03:04:54 crc kubenswrapper[4901]: I0309 03:04:54.433336 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d","Type":"ContainerStarted","Data":"8f3e0ecc8701ef810ffac6d0b7ee7465e40d3318f8d6cae6a1382396e02c80d0"} Mar 09 03:04:54 crc kubenswrapper[4901]: I0309 03:04:54.433347 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d","Type":"ContainerStarted","Data":"dd9b08f42cb5c4cb3a7f77bfdc8c4d6f8d8459932ce797310f693a7e030be0cf"} Mar 09 03:04:54 crc kubenswrapper[4901]: I0309 03:04:54.440526 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6b3d0806-00f0-46d7-a77f-f505583e49a2","Type":"ContainerStarted","Data":"905d7cd97d4c58eb8b586f62c5b5e78d65d3ffb136122fb1c755faa875c96459"} Mar 09 03:04:54 crc kubenswrapper[4901]: I0309 03:04:54.464343 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.464319254 podStartE2EDuration="2.464319254s" podCreationTimestamp="2026-03-09 03:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:54.460734513 +0000 UTC m=+1419.050398255" watchObservedRunningTime="2026-03-09 03:04:54.464319254 +0000 UTC m=+1419.053982996" Mar 09 03:04:54 crc kubenswrapper[4901]: I0309 03:04:54.813307 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:53900->10.217.0.197:8775: read: connection reset by peer" Mar 09 03:04:54 crc 
kubenswrapper[4901]: I0309 03:04:54.813330 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:53888->10.217.0.197:8775: read: connection reset by peer" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.370496 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.454998 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6b3d0806-00f0-46d7-a77f-f505583e49a2","Type":"ContainerStarted","Data":"5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27"} Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.461864 4901 generic.go:334] "Generic (PLEG): container finished" podID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerID="84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346" exitCode=0 Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.462723 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.462880 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c7fbb65-658d-416a-85da-243a966b9bc9","Type":"ContainerDied","Data":"84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346"} Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.462909 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c7fbb65-658d-416a-85da-243a966b9bc9","Type":"ContainerDied","Data":"0b697d69770acc967f3d5a01316656bf4883a8aa0eb64a1805dfde0d5bc66e6f"} Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.462925 4901 scope.go:117] "RemoveContainer" containerID="84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.485057 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.485040539 podStartE2EDuration="2.485040539s" podCreationTimestamp="2026-03-09 03:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:55.482751281 +0000 UTC m=+1420.072415053" watchObservedRunningTime="2026-03-09 03:04:55.485040539 +0000 UTC m=+1420.074704271" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.496850 4901 scope.go:117] "RemoveContainer" containerID="dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.507715 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-combined-ca-bundle\") pod \"0c7fbb65-658d-416a-85da-243a966b9bc9\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.507876 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c7fbb65-658d-416a-85da-243a966b9bc9-logs\") pod \"0c7fbb65-658d-416a-85da-243a966b9bc9\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.508077 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4plp\" (UniqueName: \"kubernetes.io/projected/0c7fbb65-658d-416a-85da-243a966b9bc9-kube-api-access-f4plp\") pod \"0c7fbb65-658d-416a-85da-243a966b9bc9\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.508110 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-config-data\") pod \"0c7fbb65-658d-416a-85da-243a966b9bc9\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.508270 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-nova-metadata-tls-certs\") pod \"0c7fbb65-658d-416a-85da-243a966b9bc9\" (UID: \"0c7fbb65-658d-416a-85da-243a966b9bc9\") " Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.508863 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c7fbb65-658d-416a-85da-243a966b9bc9-logs" (OuterVolumeSpecName: "logs") pod "0c7fbb65-658d-416a-85da-243a966b9bc9" (UID: "0c7fbb65-658d-416a-85da-243a966b9bc9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.509071 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c7fbb65-658d-416a-85da-243a966b9bc9-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.518797 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7fbb65-658d-416a-85da-243a966b9bc9-kube-api-access-f4plp" (OuterVolumeSpecName: "kube-api-access-f4plp") pod "0c7fbb65-658d-416a-85da-243a966b9bc9" (UID: "0c7fbb65-658d-416a-85da-243a966b9bc9"). InnerVolumeSpecName "kube-api-access-f4plp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.528791 4901 scope.go:117] "RemoveContainer" containerID="84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346" Mar 09 03:04:55 crc kubenswrapper[4901]: E0309 03:04:55.536480 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346\": container with ID starting with 84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346 not found: ID does not exist" containerID="84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.536535 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346"} err="failed to get container status \"84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346\": rpc error: code = NotFound desc = could not find container \"84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346\": container with ID starting with 84cf5d3e65a2b63112b3e902289abee82dcc47de580617ca860d4fe52534a346 not found: ID does not 
exist" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.536563 4901 scope.go:117] "RemoveContainer" containerID="dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625" Mar 09 03:04:55 crc kubenswrapper[4901]: E0309 03:04:55.537077 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625\": container with ID starting with dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625 not found: ID does not exist" containerID="dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.537142 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625"} err="failed to get container status \"dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625\": rpc error: code = NotFound desc = could not find container \"dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625\": container with ID starting with dca3cac93605433ad08ce4bac4c9419914d58e497e5fb2f136941444b9e7a625 not found: ID does not exist" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.540832 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-config-data" (OuterVolumeSpecName: "config-data") pod "0c7fbb65-658d-416a-85da-243a966b9bc9" (UID: "0c7fbb65-658d-416a-85da-243a966b9bc9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.563807 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c7fbb65-658d-416a-85da-243a966b9bc9" (UID: "0c7fbb65-658d-416a-85da-243a966b9bc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.586296 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0c7fbb65-658d-416a-85da-243a966b9bc9" (UID: "0c7fbb65-658d-416a-85da-243a966b9bc9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.610716 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.610751 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4plp\" (UniqueName: \"kubernetes.io/projected/0c7fbb65-658d-416a-85da-243a966b9bc9-kube-api-access-f4plp\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.610763 4901 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.610772 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0c7fbb65-658d-416a-85da-243a966b9bc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.798696 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.809762 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.825394 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:55 crc kubenswrapper[4901]: E0309 03:04:55.825745 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-log" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.825761 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-log" Mar 09 03:04:55 crc kubenswrapper[4901]: E0309 03:04:55.825776 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-metadata" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.825782 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-metadata" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.826143 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-log" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.826177 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" containerName="nova-metadata-metadata" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.827133 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.834392 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.836637 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.844967 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.917598 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.917686 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-config-data\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.917719 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chvdf\" (UniqueName: \"kubernetes.io/projected/7cd100e4-dfd3-45a7-a97c-84a05c352883-kube-api-access-chvdf\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.917748 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:55 crc kubenswrapper[4901]: I0309 03:04:55.917802 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd100e4-dfd3-45a7-a97c-84a05c352883-logs\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.019064 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-config-data\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.019105 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chvdf\" (UniqueName: \"kubernetes.io/projected/7cd100e4-dfd3-45a7-a97c-84a05c352883-kube-api-access-chvdf\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.019133 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.019182 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd100e4-dfd3-45a7-a97c-84a05c352883-logs\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " 
pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.019274 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.020355 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd100e4-dfd3-45a7-a97c-84a05c352883-logs\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.023418 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.023426 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.032261 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-config-data\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.037759 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chvdf\" (UniqueName: 
\"kubernetes.io/projected/7cd100e4-dfd3-45a7-a97c-84a05c352883-kube-api-access-chvdf\") pod \"nova-metadata-0\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.118349 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c7fbb65-658d-416a-85da-243a966b9bc9" path="/var/lib/kubelet/pods/0c7fbb65-658d-416a-85da-243a966b9bc9/volumes" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.148139 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:04:56 crc kubenswrapper[4901]: I0309 03:04:56.641975 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:04:56 crc kubenswrapper[4901]: W0309 03:04:56.644505 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cd100e4_dfd3_45a7_a97c_84a05c352883.slice/crio-55cb610bc695f2d82cda0af0f6fffc59f7cbc2cd09f4aeebbcc8033406228d71 WatchSource:0}: Error finding container 55cb610bc695f2d82cda0af0f6fffc59f7cbc2cd09f4aeebbcc8033406228d71: Status 404 returned error can't find the container with id 55cb610bc695f2d82cda0af0f6fffc59f7cbc2cd09f4aeebbcc8033406228d71 Mar 09 03:04:57 crc kubenswrapper[4901]: I0309 03:04:57.486632 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7cd100e4-dfd3-45a7-a97c-84a05c352883","Type":"ContainerStarted","Data":"633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57"} Mar 09 03:04:57 crc kubenswrapper[4901]: I0309 03:04:57.486997 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7cd100e4-dfd3-45a7-a97c-84a05c352883","Type":"ContainerStarted","Data":"243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d"} Mar 09 03:04:57 crc kubenswrapper[4901]: I0309 03:04:57.487020 4901 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7cd100e4-dfd3-45a7-a97c-84a05c352883","Type":"ContainerStarted","Data":"55cb610bc695f2d82cda0af0f6fffc59f7cbc2cd09f4aeebbcc8033406228d71"} Mar 09 03:04:57 crc kubenswrapper[4901]: I0309 03:04:57.513939 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5139171620000003 podStartE2EDuration="2.513917162s" podCreationTimestamp="2026-03-09 03:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 03:04:57.513274316 +0000 UTC m=+1422.102938078" watchObservedRunningTime="2026-03-09 03:04:57.513917162 +0000 UTC m=+1422.103580934" Mar 09 03:04:58 crc kubenswrapper[4901]: I0309 03:04:58.833289 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 03:05:01 crc kubenswrapper[4901]: I0309 03:05:01.149044 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 03:05:01 crc kubenswrapper[4901]: I0309 03:05:01.149450 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 03:05:03 crc kubenswrapper[4901]: I0309 03:05:03.263170 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 03:05:03 crc kubenswrapper[4901]: I0309 03:05:03.263565 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 03:05:03 crc kubenswrapper[4901]: I0309 03:05:03.833806 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 03:05:03 crc kubenswrapper[4901]: I0309 03:05:03.887138 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 03:05:04 crc kubenswrapper[4901]: I0309 
03:05:04.279444 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 03:05:04 crc kubenswrapper[4901]: I0309 03:05:04.279504 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 03:05:04 crc kubenswrapper[4901]: I0309 03:05:04.634274 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 03:05:06 crc kubenswrapper[4901]: I0309 03:05:06.149356 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 03:05:06 crc kubenswrapper[4901]: I0309 03:05:06.149723 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 03:05:07 crc kubenswrapper[4901]: I0309 03:05:07.162483 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 03:05:07 crc kubenswrapper[4901]: I0309 03:05:07.162569 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 03:05:10 crc kubenswrapper[4901]: I0309 
03:05:10.744322 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 03:05:13 crc kubenswrapper[4901]: I0309 03:05:13.270422 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 03:05:13 crc kubenswrapper[4901]: I0309 03:05:13.271153 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 03:05:13 crc kubenswrapper[4901]: I0309 03:05:13.280591 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 03:05:13 crc kubenswrapper[4901]: I0309 03:05:13.294463 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 03:05:13 crc kubenswrapper[4901]: I0309 03:05:13.691165 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 03:05:13 crc kubenswrapper[4901]: I0309 03:05:13.715740 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 03:05:16 crc kubenswrapper[4901]: I0309 03:05:16.156744 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 03:05:16 crc kubenswrapper[4901]: I0309 03:05:16.160447 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 03:05:16 crc kubenswrapper[4901]: I0309 03:05:16.168586 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 03:05:16 crc kubenswrapper[4901]: I0309 03:05:16.737698 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.680865 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.681670 4901 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="34bc86a8-8821-462a-b15b-c2f847f44be2" containerName="openstackclient" containerID="cri-o://1c5af01525dee027d91335bfc70f1c4c730cdc63e3d8be7b219fbbab05c905fa" gracePeriod=2 Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.715834 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.724355 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-d247z"] Mar 09 03:05:34 crc kubenswrapper[4901]: E0309 03:05:34.724731 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bc86a8-8821-462a-b15b-c2f847f44be2" containerName="openstackclient" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.724744 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bc86a8-8821-462a-b15b-c2f847f44be2" containerName="openstackclient" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.724951 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bc86a8-8821-462a-b15b-c2f847f44be2" containerName="openstackclient" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.725577 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d247z" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.737845 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.739165 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlh9p\" (UniqueName: \"kubernetes.io/projected/d57347fc-0546-466f-95e6-055857ca3685-kube-api-access-vlh9p\") pod \"root-account-create-update-d247z\" (UID: \"d57347fc-0546-466f-95e6-055857ca3685\") " pod="openstack/root-account-create-update-d247z" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.739234 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts\") pod \"root-account-create-update-d247z\" (UID: \"d57347fc-0546-466f-95e6-055857ca3685\") " pod="openstack/root-account-create-update-d247z" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.760680 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d247z"] Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.806499 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-b7nv8"] Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.833207 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-b7nv8"] Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.842074 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlh9p\" (UniqueName: \"kubernetes.io/projected/d57347fc-0546-466f-95e6-055857ca3685-kube-api-access-vlh9p\") pod \"root-account-create-update-d247z\" (UID: \"d57347fc-0546-466f-95e6-055857ca3685\") " 
pod="openstack/root-account-create-update-d247z" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.842139 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts\") pod \"root-account-create-update-d247z\" (UID: \"d57347fc-0546-466f-95e6-055857ca3685\") " pod="openstack/root-account-create-update-d247z" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.842909 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts\") pod \"root-account-create-update-d247z\" (UID: \"d57347fc-0546-466f-95e6-055857ca3685\") " pod="openstack/root-account-create-update-d247z" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.925971 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlh9p\" (UniqueName: \"kubernetes.io/projected/d57347fc-0546-466f-95e6-055857ca3685-kube-api-access-vlh9p\") pod \"root-account-create-update-d247z\" (UID: \"d57347fc-0546-466f-95e6-055857ca3685\") " pod="openstack/root-account-create-update-d247z" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.934807 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.935484 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="c86c22f2-896c-4c29-95c7-024aea61dcd2" containerName="openstack-network-exporter" containerID="cri-o://85a90efd98e045964af92120b771705db922d73189fbe96e8a40862a41bec02c" gracePeriod=300 Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.949680 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-z7t9t"] Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.951413 
4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.954883 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 09 03:05:34 crc kubenswrapper[4901]: I0309 03:05:34.998003 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-z7t9t"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.027345 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3c33-account-create-update-8vqvj"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.030263 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c33-account-create-update-8vqvj" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.036034 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3c33-account-create-update-8vqvj"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.037405 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.059236 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d247z" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.087529 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-457m6"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.120386 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-457m6"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.156925 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q867c\" (UniqueName: \"kubernetes.io/projected/97944a12-e740-486d-ab39-6b03818f3cbd-kube-api-access-q867c\") pod \"barbican-3c33-account-create-update-8vqvj\" (UID: \"97944a12-e740-486d-ab39-6b03818f3cbd\") " pod="openstack/barbican-3c33-account-create-update-8vqvj" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.156963 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts\") pod \"nova-cell1-cddb-account-create-update-z7t9t\" (UID: \"a6b8a05a-d698-4770-a883-fd60b61190b7\") " pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.157000 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwprt\" (UniqueName: \"kubernetes.io/projected/a6b8a05a-d698-4770-a883-fd60b61190b7-kube-api-access-zwprt\") pod \"nova-cell1-cddb-account-create-update-z7t9t\" (UID: \"a6b8a05a-d698-4770-a883-fd60b61190b7\") " pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.157042 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/97944a12-e740-486d-ab39-6b03818f3cbd-operator-scripts\") pod \"barbican-3c33-account-create-update-8vqvj\" (UID: \"97944a12-e740-486d-ab39-6b03818f3cbd\") " pod="openstack/barbican-3c33-account-create-update-8vqvj" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.157119 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3c33-account-create-update-5dpsl"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.172342 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3c33-account-create-update-5dpsl"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.188187 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="c86c22f2-896c-4c29-95c7-024aea61dcd2" containerName="ovsdbserver-nb" containerID="cri-o://f60874d330498787b4f53dbd548f5bbb7d7609369bb61c9f558d83dea563dbb7" gracePeriod=300 Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.200367 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.219710 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-149a-account-create-update-727vf"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.244547 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-149a-account-create-update-727vf"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.260297 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q867c\" (UniqueName: \"kubernetes.io/projected/97944a12-e740-486d-ab39-6b03818f3cbd-kube-api-access-q867c\") pod \"barbican-3c33-account-create-update-8vqvj\" (UID: \"97944a12-e740-486d-ab39-6b03818f3cbd\") " pod="openstack/barbican-3c33-account-create-update-8vqvj" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.260339 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts\") pod \"nova-cell1-cddb-account-create-update-z7t9t\" (UID: \"a6b8a05a-d698-4770-a883-fd60b61190b7\") " pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.260383 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwprt\" (UniqueName: \"kubernetes.io/projected/a6b8a05a-d698-4770-a883-fd60b61190b7-kube-api-access-zwprt\") pod \"nova-cell1-cddb-account-create-update-z7t9t\" (UID: \"a6b8a05a-d698-4770-a883-fd60b61190b7\") " pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.260436 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97944a12-e740-486d-ab39-6b03818f3cbd-operator-scripts\") pod \"barbican-3c33-account-create-update-8vqvj\" (UID: \"97944a12-e740-486d-ab39-6b03818f3cbd\") " pod="openstack/barbican-3c33-account-create-update-8vqvj" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.261178 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97944a12-e740-486d-ab39-6b03818f3cbd-operator-scripts\") pod \"barbican-3c33-account-create-update-8vqvj\" (UID: \"97944a12-e740-486d-ab39-6b03818f3cbd\") " pod="openstack/barbican-3c33-account-create-update-8vqvj" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.262164 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts\") pod \"nova-cell1-cddb-account-create-update-z7t9t\" (UID: \"a6b8a05a-d698-4770-a883-fd60b61190b7\") " 
pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.280290 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-hvvzs"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.303158 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q867c\" (UniqueName: \"kubernetes.io/projected/97944a12-e740-486d-ab39-6b03818f3cbd-kube-api-access-q867c\") pod \"barbican-3c33-account-create-update-8vqvj\" (UID: \"97944a12-e740-486d-ab39-6b03818f3cbd\") " pod="openstack/barbican-3c33-account-create-update-8vqvj" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.303420 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-hvvzs"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.312744 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwprt\" (UniqueName: \"kubernetes.io/projected/a6b8a05a-d698-4770-a883-fd60b61190b7-kube-api-access-zwprt\") pod \"nova-cell1-cddb-account-create-update-z7t9t\" (UID: \"a6b8a05a-d698-4770-a883-fd60b61190b7\") " pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.313314 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.330307 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-24d57"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.338663 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-24d57"] Mar 09 03:05:35 crc kubenswrapper[4901]: E0309 03:05:35.365607 4901 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 09 03:05:35 crc kubenswrapper[4901]: E0309 03:05:35.365654 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data podName:98538e55-cb87-49e2-9fd5-fff06d7edfdd nodeName:}" failed. No retries permitted until 2026-03-09 03:05:35.865638896 +0000 UTC m=+1460.455302628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data") pod "rabbitmq-cell1-server-0" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd") : configmap "rabbitmq-cell1-config-data" not found Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.417477 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.417760 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" containerName="ovn-northd" containerID="cri-o://380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d" gracePeriod=30 Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.418138 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" containerName="openstack-network-exporter" 
containerID="cri-o://40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593" gracePeriod=30 Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.445976 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e2c2-account-create-update-kdpd5"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.457765 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e2c2-account-create-update-kdpd5"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.464285 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-bxxsr"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.472155 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-bxxsr"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.522706 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c33-account-create-update-8vqvj" Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.586550 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-kb7jk"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.604973 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-kb7jk"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.684176 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lb2xt"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.734727 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lb2xt"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.816292 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-hltph"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.852049 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-2mh7b"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 
03:05:35.866321 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-2mh7b"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.878393 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5rg5k"] Mar 09 03:05:35 crc kubenswrapper[4901]: E0309 03:05:35.896638 4901 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 09 03:05:35 crc kubenswrapper[4901]: E0309 03:05:35.896705 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data podName:98538e55-cb87-49e2-9fd5-fff06d7edfdd nodeName:}" failed. No retries permitted until 2026-03-09 03:05:36.896689697 +0000 UTC m=+1461.486353419 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data") pod "rabbitmq-cell1-server-0" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd") : configmap "rabbitmq-cell1-config-data" not found Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.911806 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-nmx84"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.912016 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-nmx84" podUID="065dfe75-7489-4b15-8a4d-4adf13393aea" containerName="openstack-network-exporter" containerID="cri-o://97ab13e50a94e0d652fdbd979cbe25cc835cb2286e4d09687a06d52e3b5f01f1" gracePeriod=30 Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.957379 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-vw8lc"] Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.957636 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc" 
podUID="4ba69329-2c9f-4938-89b0-d1fa314d5a30" containerName="dnsmasq-dns" containerID="cri-o://2023e3fb3e8f7021afadc49905f2c75c9b97db5b0d3ac7172346d27ce1b5d2f7" gracePeriod=10 Mar 09 03:05:35 crc kubenswrapper[4901]: I0309 03:05:35.980730 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wbqpx"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.013323 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-wbqpx"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.030496 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.045651 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5ddb85f7bb-phpwc"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.045880 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5ddb85f7bb-phpwc" podUID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" containerName="placement-log" containerID="cri-o://4da5234fa05d69c236f53cbbc103505a093111b7b3b09f7a401eaded8dc333cb" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.046390 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5ddb85f7bb-phpwc" podUID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" containerName="placement-api" containerID="cri-o://ff023d589de2e36a5396ff975d9b99b7c27af8bbe416330e6c016841896c5a55" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.068533 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kzq5q"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.071074 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c86c22f2-896c-4c29-95c7-024aea61dcd2/ovsdbserver-nb/0.log" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.095479 4901 generic.go:334] "Generic 
(PLEG): container finished" podID="c86c22f2-896c-4c29-95c7-024aea61dcd2" containerID="85a90efd98e045964af92120b771705db922d73189fbe96e8a40862a41bec02c" exitCode=2 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.095504 4901 generic.go:334] "Generic (PLEG): container finished" podID="c86c22f2-896c-4c29-95c7-024aea61dcd2" containerID="f60874d330498787b4f53dbd548f5bbb7d7609369bb61c9f558d83dea563dbb7" exitCode=143 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.095628 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kzq5q"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.095650 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c86c22f2-896c-4c29-95c7-024aea61dcd2","Type":"ContainerDied","Data":"85a90efd98e045964af92120b771705db922d73189fbe96e8a40862a41bec02c"} Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.095680 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c86c22f2-896c-4c29-95c7-024aea61dcd2","Type":"ContainerDied","Data":"f60874d330498787b4f53dbd548f5bbb7d7609369bb61c9f558d83dea563dbb7"} Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.102012 4901 generic.go:334] "Generic (PLEG): container finished" podID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" containerID="40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593" exitCode=2 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.102044 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"26f9c7a2-e2b4-4be1-8206-6c067702cc74","Type":"ContainerDied","Data":"40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593"} Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.102139 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.152108 4901 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="167ad9cc-678d-499b-9be0-2e74112f84c9" containerName="openstack-network-exporter" containerID="cri-o://6f9b2c7bb5b12105cf493783a9ea56c27bfdd210b2404ad5c2701248c69906c1" gracePeriod=300 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.156666 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d5d146-9bfc-45cd-ae62-ffd05473b125" path="/var/lib/kubelet/pods/19d5d146-9bfc-45cd-ae62-ffd05473b125/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.157432 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0" path="/var/lib/kubelet/pods/1d4ba8c8-8fd5-4f29-a16e-bf1e628f99c0/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.157922 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a6d871-6acd-45a8-ae76-958e8fd0b9ec" path="/var/lib/kubelet/pods/50a6d871-6acd-45a8-ae76-958e8fd0b9ec/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.162467 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61302320-9299-4dcc-abeb-05c28dd977c1" path="/var/lib/kubelet/pods/61302320-9299-4dcc-abeb-05c28dd977c1/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.163386 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6494c542-3d82-43a7-b938-77820e0d3adb" path="/var/lib/kubelet/pods/6494c542-3d82-43a7-b938-77820e0d3adb/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.179368 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7acea3ec-d1eb-4971-b3b5-7c0b898cf07c" path="/var/lib/kubelet/pods/7acea3ec-d1eb-4971-b3b5-7c0b898cf07c/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.180385 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87881a32-acab-48f5-8e13-a5f2c01fdc09" path="/var/lib/kubelet/pods/87881a32-acab-48f5-8e13-a5f2c01fdc09/volumes" Mar 
09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.180883 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a004f541-f809-42ea-89ac-13bea0f45829" path="/var/lib/kubelet/pods/a004f541-f809-42ea-89ac-13bea0f45829/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.188523 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5" path="/var/lib/kubelet/pods/aa52f1ab-baf8-4f87-9125-6f7c6ddf21a5/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.189141 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd374a21-cd74-447e-ab94-4e60e6f0d7be" path="/var/lib/kubelet/pods/cd374a21-cd74-447e-ab94-4e60e6f0d7be/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.189677 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5d1410-e474-4ae8-980b-46092cc080b0" path="/var/lib/kubelet/pods/df5d1410-e474-4ae8-980b-46092cc080b0/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.204314 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e040c407-4b37-4bee-b200-0d97b5767ef1" path="/var/lib/kubelet/pods/e040c407-4b37-4bee-b200-0d97b5767ef1/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.205341 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfb1b2f-eb9b-47e0-905b-5785fc307df9" path="/var/lib/kubelet/pods/ecfb1b2f-eb9b-47e0-905b-5785fc307df9/volumes" Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.206096 4901 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.206142 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data podName:46c7df0b-fc0a-4fd9-b097-72da03442510 nodeName:}" failed. 
No retries permitted until 2026-03-09 03:05:36.706128883 +0000 UTC m=+1461.295792615 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data") pod "rabbitmq-server-0" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510") : configmap "rabbitmq-config-data" not found Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.260371 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.260616 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" containerName="cinder-scheduler" containerID="cri-o://facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.261001 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" containerName="probe" containerID="cri-o://91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.281738 4901 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-5rg5k" message=< Mar 09 03:05:36 crc kubenswrapper[4901]: Exiting ovn-controller (1) [ OK ] Mar 09 03:05:36 crc kubenswrapper[4901]: > Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.281781 4901 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-5rg5k" podUID="dc54c941-19d2-42c1-b9f0-a3a58999bda5" 
containerName="ovn-controller" containerID="cri-o://9006efa47acc80f02568c7e41f3501e04cd4ba5afcd137f8e6891cbea2267262" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.281816 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-5rg5k" podUID="dc54c941-19d2-42c1-b9f0-a3a58999bda5" containerName="ovn-controller" containerID="cri-o://9006efa47acc80f02568c7e41f3501e04cd4ba5afcd137f8e6891cbea2267262" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.293295 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.293764 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-server" containerID="cri-o://5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294134 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="swift-recon-cron" containerID="cri-o://5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294179 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="rsync" containerID="cri-o://98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294212 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-expirer" containerID="cri-o://ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11" 
gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294257 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-updater" containerID="cri-o://1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294290 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-auditor" containerID="cri-o://e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294321 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-replicator" containerID="cri-o://2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294369 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-server" containerID="cri-o://b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294398 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-updater" containerID="cri-o://deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294433 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" 
containerName="container-auditor" containerID="cri-o://000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294467 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-replicator" containerID="cri-o://efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294499 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-server" containerID="cri-o://88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294529 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-reaper" containerID="cri-o://8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294558 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-auditor" containerID="cri-o://b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.294589 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-replicator" containerID="cri-o://0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.324370 4901 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-api-0"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.324661 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerName="cinder-api-log" containerID="cri-o://3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.325213 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerName="cinder-api" containerID="cri-o://bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.340417 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m97c8"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.341374 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="167ad9cc-678d-499b-9be0-2e74112f84c9" containerName="ovsdbserver-sb" containerID="cri-o://58f28fe8133335254744ffb487e2889617fa95aef6fad8082e0e5543fe0012a2" gracePeriod=300 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.359374 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m97c8"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.378966 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.379178 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ac19cc68-f23c-4622-b265-6e94db65a43f" containerName="glance-log" containerID="cri-o://d69c564e49621dd1b26ec80f330b2a7ebc14dc4c83034905dc611e74754ca966" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.379845 4901 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ac19cc68-f23c-4622-b265-6e94db65a43f" containerName="glance-httpd" containerID="cri-o://fffb85172792a1f2d2029246715912b5942a62d42487ca9ffed83a800ad7a8d7" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.394325 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c86c22f2-896c-4c29-95c7-024aea61dcd2/ovsdbserver-nb/0.log" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.394424 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.398215 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d247z"] Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.403336 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 03:05:36 crc kubenswrapper[4901]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: if [ -n "" ]; then Mar 09 03:05:36 crc kubenswrapper[4901]: GRANT_DATABASE="" Mar 09 03:05:36 crc kubenswrapper[4901]: else Mar 09 03:05:36 crc kubenswrapper[4901]: GRANT_DATABASE="*" Mar 09 03:05:36 crc kubenswrapper[4901]: fi Mar 09 
03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: # going for maximum compatibility here: Mar 09 03:05:36 crc kubenswrapper[4901]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 09 03:05:36 crc kubenswrapper[4901]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 09 03:05:36 crc kubenswrapper[4901]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 09 03:05:36 crc kubenswrapper[4901]: # support updates Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: $MYSQL_CMD < logger="UnhandledError" Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.404557 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-d247z" podUID="d57347fc-0546-466f-95e6-055857ca3685" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.430986 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.433478 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerName="nova-metadata-log" containerID="cri-o://243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.433704 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerName="nova-metadata-metadata" containerID="cri-o://633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.471507 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.471722 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" containerName="glance-log" containerID="cri-o://b6d35aeb5dab9771d9b67accacc82110a3fcd1a1a64f3b6be5cc15e368bd1336" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.471967 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" containerName="glance-httpd" containerID="cri-o://923da5f384dd45a881e4088f214dd21db8342270329da027650b26fef0f69378" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.522729 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpl8z\" (UniqueName: \"kubernetes.io/projected/c86c22f2-896c-4c29-95c7-024aea61dcd2-kube-api-access-tpl8z\") pod \"c86c22f2-896c-4c29-95c7-024aea61dcd2\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.524271 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c86c22f2-896c-4c29-95c7-024aea61dcd2\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.524329 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-config\") pod \"c86c22f2-896c-4c29-95c7-024aea61dcd2\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.524362 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdb-rundir\") pod \"c86c22f2-896c-4c29-95c7-024aea61dcd2\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.524403 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-metrics-certs-tls-certs\") pod \"c86c22f2-896c-4c29-95c7-024aea61dcd2\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.524433 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-scripts\") pod \"c86c22f2-896c-4c29-95c7-024aea61dcd2\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.524515 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-combined-ca-bundle\") pod \"c86c22f2-896c-4c29-95c7-024aea61dcd2\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.524535 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdbserver-nb-tls-certs\") pod \"c86c22f2-896c-4c29-95c7-024aea61dcd2\" (UID: \"c86c22f2-896c-4c29-95c7-024aea61dcd2\") " Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.528752 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-scripts" (OuterVolumeSpecName: "scripts") pod "c86c22f2-896c-4c29-95c7-024aea61dcd2" (UID: "c86c22f2-896c-4c29-95c7-024aea61dcd2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.530521 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-config" (OuterVolumeSpecName: "config") pod "c86c22f2-896c-4c29-95c7-024aea61dcd2" (UID: "c86c22f2-896c-4c29-95c7-024aea61dcd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.530946 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "c86c22f2-896c-4c29-95c7-024aea61dcd2" (UID: "c86c22f2-896c-4c29-95c7-024aea61dcd2"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.530977 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.545536 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86c22f2-896c-4c29-95c7-024aea61dcd2-kube-api-access-tpl8z" (OuterVolumeSpecName: "kube-api-access-tpl8z") pod "c86c22f2-896c-4c29-95c7-024aea61dcd2" (UID: "c86c22f2-896c-4c29-95c7-024aea61dcd2"). InnerVolumeSpecName "kube-api-access-tpl8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.565460 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "c86c22f2-896c-4c29-95c7-024aea61dcd2" (UID: "c86c22f2-896c-4c29-95c7-024aea61dcd2"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.578312 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8504-account-create-update-82s6b"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.611726 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8504-account-create-update-82s6b"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.646637 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.646666 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.646677 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.646685 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c86c22f2-896c-4c29-95c7-024aea61dcd2-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.646694 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpl8z\" (UniqueName: \"kubernetes.io/projected/c86c22f2-896c-4c29-95c7-024aea61dcd2-kube-api-access-tpl8z\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.690089 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.690367 4901 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-api-0" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerName="nova-api-log" containerID="cri-o://8f3e0ecc8701ef810ffac6d0b7ee7465e40d3318f8d6cae6a1382396e02c80d0" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.690648 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerName="nova-api-api" containerID="cri-o://f97ba1fa97a190b5ca6b6e5bbe19c6161fb765c75b9fffa80f6b76be75c0c237" gracePeriod=30 Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.720747 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c86c22f2-896c-4c29-95c7-024aea61dcd2" (UID: "c86c22f2-896c-4c29-95c7-024aea61dcd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.720781 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-nghlv"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.793681 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "c86c22f2-896c-4c29-95c7-024aea61dcd2" (UID: "c86c22f2-896c-4c29-95c7-024aea61dcd2"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.793938 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-nghlv"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.794540 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.818366 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-plljd"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.832862 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.832896 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.832908 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.832974 4901 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.833027 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data podName:46c7df0b-fc0a-4fd9-b097-72da03442510 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:37.833010408 +0000 UTC m=+1462.422674140 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data") pod "rabbitmq-server-0" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510") : configmap "rabbitmq-config-data" not found Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.871131 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-plljd"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.886941 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c86c22f2-896c-4c29-95c7-024aea61dcd2" (UID: "c86c22f2-896c-4c29-95c7-024aea61dcd2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.894055 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8712-account-create-update-6l9cg"] Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.897544 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 03:05:36 crc kubenswrapper[4901]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: if [ -n "nova_cell1" ]; then Mar 
09 03:05:36 crc kubenswrapper[4901]: GRANT_DATABASE="nova_cell1" Mar 09 03:05:36 crc kubenswrapper[4901]: else Mar 09 03:05:36 crc kubenswrapper[4901]: GRANT_DATABASE="*" Mar 09 03:05:36 crc kubenswrapper[4901]: fi Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: # going for maximum compatibility here: Mar 09 03:05:36 crc kubenswrapper[4901]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 09 03:05:36 crc kubenswrapper[4901]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 09 03:05:36 crc kubenswrapper[4901]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 09 03:05:36 crc kubenswrapper[4901]: # support updates Mar 09 03:05:36 crc kubenswrapper[4901]: Mar 09 03:05:36 crc kubenswrapper[4901]: $MYSQL_CMD < logger="UnhandledError" Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.901786 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" podUID="a6b8a05a-d698-4770-a883-fd60b61190b7" Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.909596 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8712-account-create-update-6l9cg"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.934397 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86c22f2-896c-4c29-95c7-024aea61dcd2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.934458 4901 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 09 03:05:36 crc kubenswrapper[4901]: E0309 03:05:36.934495 4901 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data podName:98538e55-cb87-49e2-9fd5-fff06d7edfdd nodeName:}" failed. No retries permitted until 2026-03-09 03:05:38.934482644 +0000 UTC m=+1463.524146376 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data") pod "rabbitmq-cell1-server-0" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd") : configmap "rabbitmq-cell1-config-data" not found Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.946792 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9kg2m"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.962907 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9kg2m"] Mar 09 03:05:36 crc kubenswrapper[4901]: I0309 03:05:36.987304 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3362-account-create-update-tdpdc"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.004518 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b7f6df545-whtgc"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.004770 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b7f6df545-whtgc" podUID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerName="neutron-api" containerID="cri-o://fd7c32b5c4e4206c35907a41758ab9c087a3999127a4db227227c92656f344b5" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.004829 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b7f6df545-whtgc" podUID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerName="neutron-httpd" containerID="cri-o://42ab54fcd0a589e516d8eb72267b465d9fcb7af80252fced9a388a56a784446b" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.013341 4901 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server" containerID="cri-o://0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" gracePeriod=29 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.025819 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3362-account-create-update-tdpdc"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.036281 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-kks6g"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.044238 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovs-vswitchd" containerID="cri-o://b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" gracePeriod=29 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.044739 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-kks6g"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.058938 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="f0098aa8-4248-48ec-a254-368c395308b1" containerName="galera" containerID="cri-o://b4c164d97e9bee042efac1c9e73f97d746429bbdb1ecf72f43fc47c10cd2ec24" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.059045 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-z7t9t"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.074786 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-29jrh"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.081824 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-sfgd9"] Mar 09 
03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.088712 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-29jrh"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.098890 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-04a2-account-create-update-lz6dq"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.106850 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-sfgd9"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.110550 4901 generic.go:334] "Generic (PLEG): container finished" podID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerID="8f3e0ecc8701ef810ffac6d0b7ee7465e40d3318f8d6cae6a1382396e02c80d0" exitCode=143 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.110613 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d","Type":"ContainerDied","Data":"8f3e0ecc8701ef810ffac6d0b7ee7465e40d3318f8d6cae6a1382396e02c80d0"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.112388 4901 generic.go:334] "Generic (PLEG): container finished" podID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" containerID="4da5234fa05d69c236f53cbbc103505a093111b7b3b09f7a401eaded8dc333cb" exitCode=143 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.112420 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ddb85f7bb-phpwc" event={"ID":"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3","Type":"ContainerDied","Data":"4da5234fa05d69c236f53cbbc103505a093111b7b3b09f7a401eaded8dc333cb"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.115865 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-z7t9t"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.117060 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_167ad9cc-678d-499b-9be0-2e74112f84c9/ovsdbserver-sb/0.log" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.117092 4901 generic.go:334] "Generic (PLEG): container finished" podID="167ad9cc-678d-499b-9be0-2e74112f84c9" containerID="6f9b2c7bb5b12105cf493783a9ea56c27bfdd210b2404ad5c2701248c69906c1" exitCode=2 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.117101 4901 generic.go:334] "Generic (PLEG): container finished" podID="167ad9cc-678d-499b-9be0-2e74112f84c9" containerID="58f28fe8133335254744ffb487e2889617fa95aef6fad8082e0e5543fe0012a2" exitCode=143 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.117167 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"167ad9cc-678d-499b-9be0-2e74112f84c9","Type":"ContainerDied","Data":"6f9b2c7bb5b12105cf493783a9ea56c27bfdd210b2404ad5c2701248c69906c1"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.117249 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"167ad9cc-678d-499b-9be0-2e74112f84c9","Type":"ContainerDied","Data":"58f28fe8133335254744ffb487e2889617fa95aef6fad8082e0e5543fe0012a2"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.120410 4901 generic.go:334] "Generic (PLEG): container finished" podID="ac19cc68-f23c-4622-b265-6e94db65a43f" containerID="d69c564e49621dd1b26ec80f330b2a7ebc14dc4c83034905dc611e74754ca966" exitCode=143 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.120457 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac19cc68-f23c-4622-b265-6e94db65a43f","Type":"ContainerDied","Data":"d69c564e49621dd1b26ec80f330b2a7ebc14dc4c83034905dc611e74754ca966"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.125093 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-04a2-account-create-update-lz6dq"] Mar 09 03:05:37 crc 
kubenswrapper[4901]: I0309 03:05:37.132923 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-sxdq5"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.143689 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" event={"ID":"a6b8a05a-d698-4770-a883-fd60b61190b7","Type":"ContainerStarted","Data":"1e0d2229701327b5551de4e9a3afef86818395b4442cd0ea81a014074d9c3a4b"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.144065 4901 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" secret="" err="secret \"galera-openstack-cell1-dockercfg-mhqqf\" not found" Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.145713 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 03:05:37 crc kubenswrapper[4901]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: if [ -n "nova_cell1" ]; then Mar 09 03:05:37 crc kubenswrapper[4901]: GRANT_DATABASE="nova_cell1" Mar 09 03:05:37 crc kubenswrapper[4901]: else Mar 09 03:05:37 crc kubenswrapper[4901]: GRANT_DATABASE="*" Mar 09 03:05:37 crc kubenswrapper[4901]: fi Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc 
kubenswrapper[4901]: # going for maximum compatibility here: Mar 09 03:05:37 crc kubenswrapper[4901]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 09 03:05:37 crc kubenswrapper[4901]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 09 03:05:37 crc kubenswrapper[4901]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 09 03:05:37 crc kubenswrapper[4901]: # support updates Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: $MYSQL_CMD < logger="UnhandledError" Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.147204 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" podUID="a6b8a05a-d698-4770-a883-fd60b61190b7" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.157557 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-579bf976f9-ds45q"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.157884 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-579bf976f9-ds45q" podUID="cfd87218-f7dc-424a-acda-dd7b57792738" containerName="proxy-httpd" containerID="cri-o://5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.158239 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-579bf976f9-ds45q" podUID="cfd87218-f7dc-424a-acda-dd7b57792738" containerName="proxy-server" containerID="cri-o://f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.177437 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-sxdq5"] Mar 09 
03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180047 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180067 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180075 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180081 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180087 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180093 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180100 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180107 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" 
containerID="000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180113 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180119 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180126 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180131 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180137 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180144 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180192 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180214 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180657 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180672 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180682 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180691 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180700 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180708 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180716 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180724 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180733 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180741 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180749 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.180757 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 
03:05:37.188124 4901 generic.go:334] "Generic (PLEG): container finished" podID="4ba69329-2c9f-4938-89b0-d1fa314d5a30" containerID="2023e3fb3e8f7021afadc49905f2c75c9b97db5b0d3ac7172346d27ce1b5d2f7" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.188209 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.188266 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc" event={"ID":"4ba69329-2c9f-4938-89b0-d1fa314d5a30","Type":"ContainerDied","Data":"2023e3fb3e8f7021afadc49905f2c75c9b97db5b0d3ac7172346d27ce1b5d2f7"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.188283 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc" event={"ID":"4ba69329-2c9f-4938-89b0-d1fa314d5a30","Type":"ContainerDied","Data":"f2071503a136379a8a6d0a61f956bf12bd79625d11cb0b7a8f2eb58b90a4db68"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.188292 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2071503a136379a8a6d0a61f956bf12bd79625d11cb0b7a8f2eb58b90a4db68" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.188453 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="52461a44-ded9-4025-b0f1-85c22462a04f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.194108 4901 generic.go:334] "Generic (PLEG): container finished" podID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.194157 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hltph" 
event={"ID":"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89","Type":"ContainerDied","Data":"0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.197488 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c86c22f2-896c-4c29-95c7-024aea61dcd2/ovsdbserver-nb/0.log" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.197541 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c86c22f2-896c-4c29-95c7-024aea61dcd2","Type":"ContainerDied","Data":"66faf20c023bbaffbd283fb5360df2c2927680324af54043b31e59e6a5b95696"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.197585 4901 scope.go:117] "RemoveContainer" containerID="85a90efd98e045964af92120b771705db922d73189fbe96e8a40862a41bec02c" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.197716 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.204572 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3c33-account-create-update-8vqvj"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.213906 4901 generic.go:334] "Generic (PLEG): container finished" podID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerID="243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d" exitCode=143 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.214036 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7cd100e4-dfd3-45a7-a97c-84a05c352883","Type":"ContainerDied","Data":"243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.216999 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nmx84_065dfe75-7489-4b15-8a4d-4adf13393aea/openstack-network-exporter/0.log" Mar 09 
03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.217041 4901 generic.go:334] "Generic (PLEG): container finished" podID="065dfe75-7489-4b15-8a4d-4adf13393aea" containerID="97ab13e50a94e0d652fdbd979cbe25cc835cb2286e4d09687a06d52e3b5f01f1" exitCode=2 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.217095 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nmx84" event={"ID":"065dfe75-7489-4b15-8a4d-4adf13393aea","Type":"ContainerDied","Data":"97ab13e50a94e0d652fdbd979cbe25cc835cb2286e4d09687a06d52e3b5f01f1"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.217120 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nmx84" event={"ID":"065dfe75-7489-4b15-8a4d-4adf13393aea","Type":"ContainerDied","Data":"e9fb4b2a34aaff405e7d422b2c31e892d446b9f1b8c75813f2f8499868a73aed"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.217130 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9fb4b2a34aaff405e7d422b2c31e892d446b9f1b8c75813f2f8499868a73aed" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.225454 4901 generic.go:334] "Generic (PLEG): container finished" podID="dc54c941-19d2-42c1-b9f0-a3a58999bda5" containerID="9006efa47acc80f02568c7e41f3501e04cd4ba5afcd137f8e6891cbea2267262" exitCode=0 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.225548 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5rg5k" event={"ID":"dc54c941-19d2-42c1-b9f0-a3a58999bda5","Type":"ContainerDied","Data":"9006efa47acc80f02568c7e41f3501e04cd4ba5afcd137f8e6891cbea2267262"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.225609 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5rg5k" event={"ID":"dc54c941-19d2-42c1-b9f0-a3a58999bda5","Type":"ContainerDied","Data":"1fa4549ee30fd478c1cc8063cbb815d38c1347d5e0c4dc36125f708c0a58f4ee"} Mar 09 03:05:37 crc kubenswrapper[4901]: 
I0309 03:05:37.225623 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fa4549ee30fd478c1cc8063cbb815d38c1347d5e0c4dc36125f708c0a58f4ee" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.230460 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7d497f76dc-pptvt"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.230875 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7d497f76dc-pptvt" podUID="6790ccc5-8f7f-4de8-bd69-652661631307" containerName="barbican-worker-log" containerID="cri-o://4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.231633 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7d497f76dc-pptvt" podUID="6790ccc5-8f7f-4de8-bd69-652661631307" containerName="barbican-worker" containerID="cri-o://c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.234259 4901 generic.go:334] "Generic (PLEG): container finished" podID="34bc86a8-8821-462a-b15b-c2f847f44be2" containerID="1c5af01525dee027d91335bfc70f1c4c730cdc63e3d8be7b219fbbab05c905fa" exitCode=137 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.237809 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc" Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.241822 4901 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.241865 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts podName:a6b8a05a-d698-4770-a883-fd60b61190b7 nodeName:}" failed. 
No retries permitted until 2026-03-09 03:05:37.741851528 +0000 UTC m=+1462.331515260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts") pod "nova-cell1-cddb-account-create-update-z7t9t" (UID: "a6b8a05a-d698-4770-a883-fd60b61190b7") : configmap "openstack-cell1-scripts" not found Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.243081 4901 generic.go:334] "Generic (PLEG): container finished" podID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" containerID="b6d35aeb5dab9771d9b67accacc82110a3fcd1a1a64f3b6be5cc15e368bd1336" exitCode=143 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.243136 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"12ec135f-33b3-4be3-bb27-5bb0ea25ddce","Type":"ContainerDied","Data":"b6d35aeb5dab9771d9b67accacc82110a3fcd1a1a64f3b6be5cc15e368bd1336"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.244659 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nmx84_065dfe75-7489-4b15-8a4d-4adf13393aea/openstack-network-exporter/0.log" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.244727 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.259736 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-875b9dd78-8t9g6"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.261645 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" podUID="baa336b3-abdd-43e2-9c54-6d8d34c71204" containerName="barbican-keystone-listener-log" containerID="cri-o://6e6b9a76927163c431cd48ce16cec53d765e06541e962dcf3dccf5f2b20e4b6e" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.261737 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" podUID="baa336b3-abdd-43e2-9c54-6d8d34c71204" containerName="barbican-keystone-listener" containerID="cri-o://3ed5c94eba80813636cf9478d7655211647869ac0b7da74e90655f7e8fc79465" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.265338 4901 generic.go:334] "Generic (PLEG): container finished" podID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerID="3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5" exitCode=143 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.265389 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"719d451b-159a-4fa7-9c72-54f42fb4f216","Type":"ContainerDied","Data":"3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.266405 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5rg5k" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.267347 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d247z" event={"ID":"d57347fc-0546-466f-95e6-055857ca3685","Type":"ContainerStarted","Data":"6be556f249be4c79573196413b63a0499871594f221f3a791d4e9a538be6e9fc"} Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.267390 4901 scope.go:117] "RemoveContainer" containerID="f60874d330498787b4f53dbd548f5bbb7d7609369bb61c9f558d83dea563dbb7" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.267894 4901 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-d247z" secret="" err="secret \"galera-openstack-cell1-dockercfg-mhqqf\" not found" Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.270856 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 03:05:37 crc kubenswrapper[4901]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: if [ -n "" ]; then Mar 09 03:05:37 crc kubenswrapper[4901]: GRANT_DATABASE="" Mar 09 03:05:37 crc kubenswrapper[4901]: else Mar 09 03:05:37 crc kubenswrapper[4901]: GRANT_DATABASE="*" Mar 09 03:05:37 crc kubenswrapper[4901]: fi Mar 09 03:05:37 crc 
kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: # going for maximum compatibility here: Mar 09 03:05:37 crc kubenswrapper[4901]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 09 03:05:37 crc kubenswrapper[4901]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 09 03:05:37 crc kubenswrapper[4901]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 09 03:05:37 crc kubenswrapper[4901]: # support updates Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: $MYSQL_CMD < logger="UnhandledError" Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.272969 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-d247z" podUID="d57347fc-0546-466f-95e6-055857ca3685" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.276450 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d247z"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.285334 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cc8485b48-f86rl"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.285691 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cc8485b48-f86rl" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerName="barbican-api-log" containerID="cri-o://148d0935d0af545a31bff0013cab741797e444db8fdba8b7ef3fe82da71d3a67" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.285889 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cc8485b48-f86rl" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerName="barbican-api" 
containerID="cri-o://cf4f1b9588039d52f25928fa5f258488b5893e580645e30a5d8a63dcf3396c0b" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.304840 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.334852 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jv9hz"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.340383 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jv9hz"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.342929 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntvll\" (UniqueName: \"kubernetes.io/projected/dc54c941-19d2-42c1-b9f0-a3a58999bda5-kube-api-access-ntvll\") pod \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.342965 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-combined-ca-bundle\") pod \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343022 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcw8z\" (UniqueName: \"kubernetes.io/projected/4ba69329-2c9f-4938-89b0-d1fa314d5a30-kube-api-access-vcw8z\") pod \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343065 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-nb\") pod \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\" 
(UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343093 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-log-ovn\") pod \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343121 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-ovn-controller-tls-certs\") pod \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343144 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run\") pod \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343174 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovn-rundir\") pod \"065dfe75-7489-4b15-8a4d-4adf13393aea\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343195 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-combined-ca-bundle\") pod \"065dfe75-7489-4b15-8a4d-4adf13393aea\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343215 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovs-rundir\") pod \"065dfe75-7489-4b15-8a4d-4adf13393aea\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343253 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-svc\") pod \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343269 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-swift-storage-0\") pod \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343286 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run-ovn\") pod \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343306 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065dfe75-7489-4b15-8a4d-4adf13393aea-config\") pod \"065dfe75-7489-4b15-8a4d-4adf13393aea\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343360 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-sb\") pod \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343393 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-metrics-certs-tls-certs\") pod \"065dfe75-7489-4b15-8a4d-4adf13393aea\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343414 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc54c941-19d2-42c1-b9f0-a3a58999bda5-scripts\") pod \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\" (UID: \"dc54c941-19d2-42c1-b9f0-a3a58999bda5\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343446 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh9p6\" (UniqueName: \"kubernetes.io/projected/065dfe75-7489-4b15-8a4d-4adf13393aea-kube-api-access-jh9p6\") pod \"065dfe75-7489-4b15-8a4d-4adf13393aea\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.343469 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-config\") pod \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\" (UID: \"4ba69329-2c9f-4938-89b0-d1fa314d5a30\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.344196 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run" (OuterVolumeSpecName: "var-run") pod "dc54c941-19d2-42c1-b9f0-a3a58999bda5" (UID: "dc54c941-19d2-42c1-b9f0-a3a58999bda5"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.344251 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "065dfe75-7489-4b15-8a4d-4adf13393aea" (UID: "065dfe75-7489-4b15-8a4d-4adf13393aea"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.350564 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.350664 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "dc54c941-19d2-42c1-b9f0-a3a58999bda5" (UID: "dc54c941-19d2-42c1-b9f0-a3a58999bda5"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.350739 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="19b624e5-b3de-4724-b995-829d3fcd48ae" containerName="nova-cell1-conductor-conductor" containerID="cri-o://efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.351267 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "065dfe75-7489-4b15-8a4d-4adf13393aea" (UID: "065dfe75-7489-4b15-8a4d-4adf13393aea"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.356981 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "dc54c941-19d2-42c1-b9f0-a3a58999bda5" (UID: "dc54c941-19d2-42c1-b9f0-a3a58999bda5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.358593 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc54c941-19d2-42c1-b9f0-a3a58999bda5-scripts" (OuterVolumeSpecName: "scripts") pod "dc54c941-19d2-42c1-b9f0-a3a58999bda5" (UID: "dc54c941-19d2-42c1-b9f0-a3a58999bda5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.358689 4901 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.358743 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts podName:d57347fc-0546-466f-95e6-055857ca3685 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:37.858727063 +0000 UTC m=+1462.448390795 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts") pod "root-account-create-update-d247z" (UID: "d57347fc-0546-466f-95e6-055857ca3685") : configmap "openstack-cell1-scripts" not found Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.358780 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c2nzk"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.359376 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba69329-2c9f-4938-89b0-d1fa314d5a30-kube-api-access-vcw8z" (OuterVolumeSpecName: "kube-api-access-vcw8z") pod "4ba69329-2c9f-4938-89b0-d1fa314d5a30" (UID: "4ba69329-2c9f-4938-89b0-d1fa314d5a30"). InnerVolumeSpecName "kube-api-access-vcw8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.360120 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/065dfe75-7489-4b15-8a4d-4adf13393aea-config" (OuterVolumeSpecName: "config") pod "065dfe75-7489-4b15-8a4d-4adf13393aea" (UID: "065dfe75-7489-4b15-8a4d-4adf13393aea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.361068 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/065dfe75-7489-4b15-8a4d-4adf13393aea-kube-api-access-jh9p6" (OuterVolumeSpecName: "kube-api-access-jh9p6") pod "065dfe75-7489-4b15-8a4d-4adf13393aea" (UID: "065dfe75-7489-4b15-8a4d-4adf13393aea"). InnerVolumeSpecName "kube-api-access-jh9p6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.366694 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_167ad9cc-678d-499b-9be0-2e74112f84c9/ovsdbserver-sb/0.log" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.366760 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.378449 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc54c941-19d2-42c1-b9f0-a3a58999bda5-kube-api-access-ntvll" (OuterVolumeSpecName: "kube-api-access-ntvll") pod "dc54c941-19d2-42c1-b9f0-a3a58999bda5" (UID: "dc54c941-19d2-42c1-b9f0-a3a58999bda5"). InnerVolumeSpecName "kube-api-access-ntvll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.389002 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c2nzk"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.390175 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="98538e55-cb87-49e2-9fd5-fff06d7edfdd" containerName="rabbitmq" containerID="cri-o://d88fb8444efa6a21fe15aca1c8ba0da30c0a28364fd9a1356f05611a979ae19f" gracePeriod=604800 Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.398350 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 03:05:37 crc kubenswrapper[4901]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 09 03:05:37 crc 
kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: if [ -n "barbican" ]; then Mar 09 03:05:37 crc kubenswrapper[4901]: GRANT_DATABASE="barbican" Mar 09 03:05:37 crc kubenswrapper[4901]: else Mar 09 03:05:37 crc kubenswrapper[4901]: GRANT_DATABASE="*" Mar 09 03:05:37 crc kubenswrapper[4901]: fi Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: # going for maximum compatibility here: Mar 09 03:05:37 crc kubenswrapper[4901]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 09 03:05:37 crc kubenswrapper[4901]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 09 03:05:37 crc kubenswrapper[4901]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 09 03:05:37 crc kubenswrapper[4901]: # support updates Mar 09 03:05:37 crc kubenswrapper[4901]: Mar 09 03:05:37 crc kubenswrapper[4901]: $MYSQL_CMD < logger="UnhandledError" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.398587 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.398743 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="412e01f6-e4bb-4bbd-ba88-5726f3e2f87f" containerName="nova-cell0-conductor-conductor" containerID="cri-o://2a8500133ddbae16882734d17dcfeac24a437220c873e5c49b9335461b23a2a0" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.406306 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret 
\\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-3c33-account-create-update-8vqvj" podUID="97944a12-e740-486d-ab39-6b03818f3cbd" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.406705 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.406838 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6b3d0806-00f0-46d7-a77f-f505583e49a2" containerName="nova-scheduler-scheduler" containerID="cri-o://5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" gracePeriod=30 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.442517 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.447897 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdbserver-sb-tls-certs\") pod \"167ad9cc-678d-499b-9be0-2e74112f84c9\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.447939 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdb-rundir\") pod \"167ad9cc-678d-499b-9be0-2e74112f84c9\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448112 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-config\") pod \"167ad9cc-678d-499b-9be0-2e74112f84c9\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448148 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-scripts\") pod \"167ad9cc-678d-499b-9be0-2e74112f84c9\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448163 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-combined-ca-bundle\") pod \"167ad9cc-678d-499b-9be0-2e74112f84c9\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448187 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"167ad9cc-678d-499b-9be0-2e74112f84c9\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448266 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxs9l\" (UniqueName: \"kubernetes.io/projected/167ad9cc-678d-499b-9be0-2e74112f84c9-kube-api-access-xxs9l\") pod \"167ad9cc-678d-499b-9be0-2e74112f84c9\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448306 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-metrics-certs-tls-certs\") pod \"167ad9cc-678d-499b-9be0-2e74112f84c9\" (UID: \"167ad9cc-678d-499b-9be0-2e74112f84c9\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448930 4901 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448941 4901 reconciler_common.go:293] 
"Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448950 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448958 4901 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/065dfe75-7489-4b15-8a4d-4adf13393aea-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448966 4901 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc54c941-19d2-42c1-b9f0-a3a58999bda5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448975 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065dfe75-7489-4b15-8a4d-4adf13393aea-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448983 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc54c941-19d2-42c1-b9f0-a3a58999bda5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448991 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh9p6\" (UniqueName: \"kubernetes.io/projected/065dfe75-7489-4b15-8a4d-4adf13393aea-kube-api-access-jh9p6\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.448999 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntvll\" (UniqueName: \"kubernetes.io/projected/dc54c941-19d2-42c1-b9f0-a3a58999bda5-kube-api-access-ntvll\") on node \"crc\" 
DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.449007 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcw8z\" (UniqueName: \"kubernetes.io/projected/4ba69329-2c9f-4938-89b0-d1fa314d5a30-kube-api-access-vcw8z\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.449521 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-scripts" (OuterVolumeSpecName: "scripts") pod "167ad9cc-678d-499b-9be0-2e74112f84c9" (UID: "167ad9cc-678d-499b-9be0-2e74112f84c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.455933 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "167ad9cc-678d-499b-9be0-2e74112f84c9" (UID: "167ad9cc-678d-499b-9be0-2e74112f84c9"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.456459 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "167ad9cc-678d-499b-9be0-2e74112f84c9" (UID: "167ad9cc-678d-499b-9be0-2e74112f84c9"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.459649 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.459944 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-config" (OuterVolumeSpecName: "config") pod "167ad9cc-678d-499b-9be0-2e74112f84c9" (UID: "167ad9cc-678d-499b-9be0-2e74112f84c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.463561 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.481733 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/167ad9cc-678d-499b-9be0-2e74112f84c9-kube-api-access-xxs9l" (OuterVolumeSpecName: "kube-api-access-xxs9l") pod "167ad9cc-678d-499b-9be0-2e74112f84c9" (UID: "167ad9cc-678d-499b-9be0-2e74112f84c9"). InnerVolumeSpecName "kube-api-access-xxs9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.487307 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3c33-account-create-update-8vqvj"] Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.519757 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.550233 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "065dfe75-7489-4b15-8a4d-4adf13393aea" (UID: "065dfe75-7489-4b15-8a4d-4adf13393aea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.550604 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config\") pod \"34bc86a8-8821-462a-b15b-c2f847f44be2\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.550742 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbbk9\" (UniqueName: \"kubernetes.io/projected/34bc86a8-8821-462a-b15b-c2f847f44be2-kube-api-access-jbbk9\") pod \"34bc86a8-8821-462a-b15b-c2f847f44be2\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.565296 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-combined-ca-bundle\") pod \"34bc86a8-8821-462a-b15b-c2f847f44be2\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.565431 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config-secret\") pod \"34bc86a8-8821-462a-b15b-c2f847f44be2\" (UID: \"34bc86a8-8821-462a-b15b-c2f847f44be2\") " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.566434 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-combined-ca-bundle\") pod \"065dfe75-7489-4b15-8a4d-4adf13393aea\" (UID: \"065dfe75-7489-4b15-8a4d-4adf13393aea\") " Mar 09 03:05:37 crc kubenswrapper[4901]: W0309 03:05:37.567573 4901 empty_dir.go:500] Warning: Unmount skipped because path does not 
exist: /var/lib/kubelet/pods/065dfe75-7489-4b15-8a4d-4adf13393aea/volumes/kubernetes.io~secret/combined-ca-bundle Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.567611 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "065dfe75-7489-4b15-8a4d-4adf13393aea" (UID: "065dfe75-7489-4b15-8a4d-4adf13393aea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.568033 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.568057 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.568076 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/167ad9cc-678d-499b-9be0-2e74112f84c9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.568100 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.568113 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxs9l\" (UniqueName: \"kubernetes.io/projected/167ad9cc-678d-499b-9be0-2e74112f84c9-kube-api-access-xxs9l\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.568132 4901 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.575435 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34bc86a8-8821-462a-b15b-c2f847f44be2-kube-api-access-jbbk9" (OuterVolumeSpecName: "kube-api-access-jbbk9") pod "34bc86a8-8821-462a-b15b-c2f847f44be2" (UID: "34bc86a8-8821-462a-b15b-c2f847f44be2"). InnerVolumeSpecName "kube-api-access-jbbk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.589654 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc54c941-19d2-42c1-b9f0-a3a58999bda5" (UID: "dc54c941-19d2-42c1-b9f0-a3a58999bda5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.604343 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="46c7df0b-fc0a-4fd9-b097-72da03442510" containerName="rabbitmq" containerID="cri-o://8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0" gracePeriod=604800 Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.644927 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ba69329-2c9f-4938-89b0-d1fa314d5a30" (UID: "4ba69329-2c9f-4938-89b0-d1fa314d5a30"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.663058 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "167ad9cc-678d-499b-9be0-2e74112f84c9" (UID: "167ad9cc-678d-499b-9be0-2e74112f84c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.668632 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ba69329-2c9f-4938-89b0-d1fa314d5a30" (UID: "4ba69329-2c9f-4938-89b0-d1fa314d5a30"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.670651 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbbk9\" (UniqueName: \"kubernetes.io/projected/34bc86a8-8821-462a-b15b-c2f847f44be2-kube-api-access-jbbk9\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.670678 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.670688 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.670696 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.670706 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.670926 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.671516 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ba69329-2c9f-4938-89b0-d1fa314d5a30" (UID: "4ba69329-2c9f-4938-89b0-d1fa314d5a30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.687261 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34bc86a8-8821-462a-b15b-c2f847f44be2" (UID: "34bc86a8-8821-462a-b15b-c2f847f44be2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.692214 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4ba69329-2c9f-4938-89b0-d1fa314d5a30" (UID: "4ba69329-2c9f-4938-89b0-d1fa314d5a30"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.707595 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "34bc86a8-8821-462a-b15b-c2f847f44be2" (UID: "34bc86a8-8821-462a-b15b-c2f847f44be2"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.718157 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-config" (OuterVolumeSpecName: "config") pod "4ba69329-2c9f-4938-89b0-d1fa314d5a30" (UID: "4ba69329-2c9f-4938-89b0-d1fa314d5a30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.722001 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "dc54c941-19d2-42c1-b9f0-a3a58999bda5" (UID: "dc54c941-19d2-42c1-b9f0-a3a58999bda5"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.726464 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "065dfe75-7489-4b15-8a4d-4adf13393aea" (UID: "065dfe75-7489-4b15-8a4d-4adf13393aea"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.731108 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="52461a44-ded9-4025-b0f1-85c22462a04f" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.202:6080/vnc_lite.html\": dial tcp 10.217.0.202:6080: connect: connection refused" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.745377 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "167ad9cc-678d-499b-9be0-2e74112f84c9" (UID: "167ad9cc-678d-499b-9be0-2e74112f84c9"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.748145 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "167ad9cc-678d-499b-9be0-2e74112f84c9" (UID: "167ad9cc-678d-499b-9be0-2e74112f84c9"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.756604 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "34bc86a8-8821-462a-b15b-c2f847f44be2" (UID: "34bc86a8-8821-462a-b15b-c2f847f44be2"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.774056 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc54c941-19d2-42c1-b9f0-a3a58999bda5-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.774086 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.774095 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.774106 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.774114 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/167ad9cc-678d-499b-9be0-2e74112f84c9-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.774122 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.774132 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/065dfe75-7489-4b15-8a4d-4adf13393aea-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 
03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.774140 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba69329-2c9f-4938-89b0-d1fa314d5a30-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.774269 4901 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.774373 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts podName:a6b8a05a-d698-4770-a883-fd60b61190b7 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:38.774347364 +0000 UTC m=+1463.364011136 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts") pod "nova-cell1-cddb-account-create-update-z7t9t" (UID: "a6b8a05a-d698-4770-a883-fd60b61190b7") : configmap "openstack-cell1-scripts" not found Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.775583 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.775748 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: I0309 03:05:37.775759 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34bc86a8-8821-462a-b15b-c2f847f44be2-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.878433 4901 
configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.878497 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts podName:d57347fc-0546-466f-95e6-055857ca3685 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:38.878482948 +0000 UTC m=+1463.468146680 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts") pod "root-account-create-update-d247z" (UID: "d57347fc-0546-466f-95e6-055857ca3685") : configmap "openstack-cell1-scripts" not found Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.878600 4901 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 09 03:05:37 crc kubenswrapper[4901]: E0309 03:05:37.878685 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data podName:46c7df0b-fc0a-4fd9-b097-72da03442510 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:39.878663593 +0000 UTC m=+1464.468327335 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data") pod "rabbitmq-server-0" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510") : configmap "rabbitmq-config-data" not found Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.084412 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.085595 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.086942 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.086994 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="19b624e5-b3de-4724-b995-829d3fcd48ae" containerName="nova-cell1-conductor-conductor" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.152806 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="03fb66e1-3428-40d7-a0b4-b8a7938ff800" path="/var/lib/kubelet/pods/03fb66e1-3428-40d7-a0b4-b8a7938ff800/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.155502 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d03d05-f63f-4a84-be7e-fcfaaae0505d" path="/var/lib/kubelet/pods/09d03d05-f63f-4a84-be7e-fcfaaae0505d/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.156436 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c8555bf-e516-4e10-be72-afbbb53fb31e" path="/var/lib/kubelet/pods/0c8555bf-e516-4e10-be72-afbbb53fb31e/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.157147 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="142cab77-aec7-45a4-9c64-45c3209e2a9d" path="/var/lib/kubelet/pods/142cab77-aec7-45a4-9c64-45c3209e2a9d/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.162830 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="173f3ec0-4825-426f-8a8f-fa693b7068d2" path="/var/lib/kubelet/pods/173f3ec0-4825-426f-8a8f-fa693b7068d2/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.163596 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a093673-8ed6-457e-8981-83864827e781" path="/var/lib/kubelet/pods/1a093673-8ed6-457e-8981-83864827e781/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.164370 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34bc86a8-8821-462a-b15b-c2f847f44be2" path="/var/lib/kubelet/pods/34bc86a8-8821-462a-b15b-c2f847f44be2/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.165097 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c" path="/var/lib/kubelet/pods/5bfe9cc4-0c72-472c-a6ae-e08d6ae58d6c/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.174979 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="617dd4a9-970a-4c70-b587-f74323c172da" path="/var/lib/kubelet/pods/617dd4a9-970a-4c70-b587-f74323c172da/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.176730 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698029af-0c10-4446-81f0-fd59859b8722" path="/var/lib/kubelet/pods/698029af-0c10-4446-81f0-fd59859b8722/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.177401 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be4ea2e-b742-478c-a6e3-56f43a856e40" path="/var/lib/kubelet/pods/6be4ea2e-b742-478c-a6e3-56f43a856e40/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.178039 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e394216-8657-46dd-95d8-5d0e73512d11" path="/var/lib/kubelet/pods/7e394216-8657-46dd-95d8-5d0e73512d11/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.179855 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfeef473-c12a-4af0-9a27-f1fe52a0b144" path="/var/lib/kubelet/pods/bfeef473-c12a-4af0-9a27-f1fe52a0b144/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.180867 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86c22f2-896c-4c29-95c7-024aea61dcd2" path="/var/lib/kubelet/pods/c86c22f2-896c-4c29-95c7-024aea61dcd2/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.186390 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d752c748-e235-4087-849e-3fe86c6e52b4" path="/var/lib/kubelet/pods/d752c748-e235-4087-849e-3fe86c6e52b4/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.187369 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea546a17-3742-4171-a2ff-1df8b5dce890" path="/var/lib/kubelet/pods/ea546a17-3742-4171-a2ff-1df8b5dce890/volumes" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.188921 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.189179 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="ceilometer-central-agent" containerID="cri-o://fd30c7afd1e4a7b025b8b590cdbf65ff2deb03badcd9cc3419bf24e1b542c1ed" gracePeriod=30 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.189455 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-579bf976f9-ds45q" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.189745 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="proxy-httpd" containerID="cri-o://2cd524a1798da2be5336faeee44da3b0c7ebcf43441ba6883148aca3906e53c0" gracePeriod=30 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.189800 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="sg-core" containerID="cri-o://ef72ec9152a71c9764e40c053af5316561da42488861f73e1bfb72f8538b1bb8" gracePeriod=30 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.191750 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="ceilometer-notification-agent" containerID="cri-o://5728e2e081eb4563b67c226046888a882e8bdb83914ea86f35f1527e56c5d36a" gracePeriod=30 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.229886 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.239340 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.250522 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="7050aa4c-725b-482a-8b90-f1374b3a4a42" containerName="kube-state-metrics" containerID="cri-o://149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6" gracePeriod=30 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285209 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-public-tls-certs\") pod \"cfd87218-f7dc-424a-acda-dd7b57792738\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285293 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-vencrypt-tls-certs\") pod \"52461a44-ded9-4025-b0f1-85c22462a04f\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285355 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb5x2\" (UniqueName: \"kubernetes.io/projected/52461a44-ded9-4025-b0f1-85c22462a04f-kube-api-access-gb5x2\") pod \"52461a44-ded9-4025-b0f1-85c22462a04f\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285399 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-log-httpd\") pod \"cfd87218-f7dc-424a-acda-dd7b57792738\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285493 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-nova-novncproxy-tls-certs\") pod \"52461a44-ded9-4025-b0f1-85c22462a04f\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285538 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-combined-ca-bundle\") pod \"52461a44-ded9-4025-b0f1-85c22462a04f\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285576 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-internal-tls-certs\") pod \"cfd87218-f7dc-424a-acda-dd7b57792738\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285597 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-config-data\") pod \"cfd87218-f7dc-424a-acda-dd7b57792738\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285644 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-combined-ca-bundle\") pod \"cfd87218-f7dc-424a-acda-dd7b57792738\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285670 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2bgt\" (UniqueName: \"kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-kube-api-access-z2bgt\") pod \"cfd87218-f7dc-424a-acda-dd7b57792738\" (UID: 
\"cfd87218-f7dc-424a-acda-dd7b57792738\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285732 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-config-data\") pod \"52461a44-ded9-4025-b0f1-85c22462a04f\" (UID: \"52461a44-ded9-4025-b0f1-85c22462a04f\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285775 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-run-httpd\") pod \"cfd87218-f7dc-424a-acda-dd7b57792738\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.285832 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-etc-swift\") pod \"cfd87218-f7dc-424a-acda-dd7b57792738\" (UID: \"cfd87218-f7dc-424a-acda-dd7b57792738\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.309696 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cfd87218-f7dc-424a-acda-dd7b57792738" (UID: "cfd87218-f7dc-424a-acda-dd7b57792738"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.309708 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cfd87218-f7dc-424a-acda-dd7b57792738" (UID: "cfd87218-f7dc-424a-acda-dd7b57792738"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.324959 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52461a44-ded9-4025-b0f1-85c22462a04f-kube-api-access-gb5x2" (OuterVolumeSpecName: "kube-api-access-gb5x2") pod "52461a44-ded9-4025-b0f1-85c22462a04f" (UID: "52461a44-ded9-4025-b0f1-85c22462a04f"). InnerVolumeSpecName "kube-api-access-gb5x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.333011 4901 generic.go:334] "Generic (PLEG): container finished" podID="52461a44-ded9-4025-b0f1-85c22462a04f" containerID="4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa" exitCode=0 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.333267 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.333259 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52461a44-ded9-4025-b0f1-85c22462a04f","Type":"ContainerDied","Data":"4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.333321 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52461a44-ded9-4025-b0f1-85c22462a04f","Type":"ContainerDied","Data":"22706626cd82769b885a529ea386be9f38a0164eb502bacdbbb1be7b00386f81"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.333339 4901 scope.go:117] "RemoveContainer" containerID="4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.342206 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-kube-api-access-z2bgt" (OuterVolumeSpecName: 
"kube-api-access-z2bgt") pod "cfd87218-f7dc-424a-acda-dd7b57792738" (UID: "cfd87218-f7dc-424a-acda-dd7b57792738"). InnerVolumeSpecName "kube-api-access-z2bgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.356779 4901 generic.go:334] "Generic (PLEG): container finished" podID="f0098aa8-4248-48ec-a254-368c395308b1" containerID="b4c164d97e9bee042efac1c9e73f97d746429bbdb1ecf72f43fc47c10cd2ec24" exitCode=0 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.356981 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f0098aa8-4248-48ec-a254-368c395308b1","Type":"ContainerDied","Data":"b4c164d97e9bee042efac1c9e73f97d746429bbdb1ecf72f43fc47c10cd2ec24"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.371658 4901 generic.go:334] "Generic (PLEG): container finished" podID="cfd87218-f7dc-424a-acda-dd7b57792738" containerID="f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2" exitCode=0 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.371694 4901 generic.go:334] "Generic (PLEG): container finished" podID="cfd87218-f7dc-424a-acda-dd7b57792738" containerID="5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd" exitCode=0 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.371735 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-579bf976f9-ds45q" event={"ID":"cfd87218-f7dc-424a-acda-dd7b57792738","Type":"ContainerDied","Data":"f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.371761 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-579bf976f9-ds45q" event={"ID":"cfd87218-f7dc-424a-acda-dd7b57792738","Type":"ContainerDied","Data":"5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.371772 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-579bf976f9-ds45q" event={"ID":"cfd87218-f7dc-424a-acda-dd7b57792738","Type":"ContainerDied","Data":"9d64fb7ca5417e814d9976f81b03a48cde8b3f4460dfc64117105a4c35428c75"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.371823 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-579bf976f9-ds45q" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.371938 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cfd87218-f7dc-424a-acda-dd7b57792738" (UID: "cfd87218-f7dc-424a-acda-dd7b57792738"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.378047 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c33-account-create-update-8vqvj" event={"ID":"97944a12-e740-486d-ab39-6b03818f3cbd","Type":"ContainerStarted","Data":"82ac3107b4fc1a61dc1013fced1cff813f09ec93dc369e57c611c0fa5b71c0d0"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.389569 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.389596 4901 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.389605 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb5x2\" (UniqueName: \"kubernetes.io/projected/52461a44-ded9-4025-b0f1-85c22462a04f-kube-api-access-gb5x2\") on node \"crc\" DevicePath \"\"" Mar 09 
03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.389613 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfd87218-f7dc-424a-acda-dd7b57792738-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.389621 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2bgt\" (UniqueName: \"kubernetes.io/projected/cfd87218-f7dc-424a-acda-dd7b57792738-kube-api-access-z2bgt\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.390914 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.391181 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="2b3e03cd-75ae-46dc-aee4-b778929cf535" containerName="memcached" containerID="cri-o://854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5" gracePeriod=30 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.400327 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_167ad9cc-678d-499b-9be0-2e74112f84c9/ovsdbserver-sb/0.log" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.400421 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"167ad9cc-678d-499b-9be0-2e74112f84c9","Type":"ContainerDied","Data":"46f0b2c02c4740aa4f7c827024b1083c604e073028efe8be643c2bdaa5a311aa"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.400524 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.414675 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.425921 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0c1f-account-create-update-rllhk"] Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427155 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065dfe75-7489-4b15-8a4d-4adf13393aea" containerName="openstack-network-exporter" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427189 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="065dfe75-7489-4b15-8a4d-4adf13393aea" containerName="openstack-network-exporter" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427211 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="167ad9cc-678d-499b-9be0-2e74112f84c9" containerName="openstack-network-exporter" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427238 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="167ad9cc-678d-499b-9be0-2e74112f84c9" containerName="openstack-network-exporter" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427247 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba69329-2c9f-4938-89b0-d1fa314d5a30" containerName="init" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427252 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba69329-2c9f-4938-89b0-d1fa314d5a30" containerName="init" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427263 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86c22f2-896c-4c29-95c7-024aea61dcd2" containerName="ovsdbserver-nb" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427268 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86c22f2-896c-4c29-95c7-024aea61dcd2" containerName="ovsdbserver-nb" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427278 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dc54c941-19d2-42c1-b9f0-a3a58999bda5" containerName="ovn-controller" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427284 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc54c941-19d2-42c1-b9f0-a3a58999bda5" containerName="ovn-controller" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427293 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="167ad9cc-678d-499b-9be0-2e74112f84c9" containerName="ovsdbserver-sb" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427299 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="167ad9cc-678d-499b-9be0-2e74112f84c9" containerName="ovsdbserver-sb" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427378 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd87218-f7dc-424a-acda-dd7b57792738" containerName="proxy-httpd" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427443 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd87218-f7dc-424a-acda-dd7b57792738" containerName="proxy-httpd" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427481 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba69329-2c9f-4938-89b0-d1fa314d5a30" containerName="dnsmasq-dns" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427488 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba69329-2c9f-4938-89b0-d1fa314d5a30" containerName="dnsmasq-dns" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427499 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86c22f2-896c-4c29-95c7-024aea61dcd2" containerName="openstack-network-exporter" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427505 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86c22f2-896c-4c29-95c7-024aea61dcd2" containerName="openstack-network-exporter" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427536 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="52461a44-ded9-4025-b0f1-85c22462a04f" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427565 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="52461a44-ded9-4025-b0f1-85c22462a04f" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.427577 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd87218-f7dc-424a-acda-dd7b57792738" containerName="proxy-server" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427584 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd87218-f7dc-424a-acda-dd7b57792738" containerName="proxy-server" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427757 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="167ad9cc-678d-499b-9be0-2e74112f84c9" containerName="ovsdbserver-sb" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427775 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfd87218-f7dc-424a-acda-dd7b57792738" containerName="proxy-httpd" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427831 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba69329-2c9f-4938-89b0-d1fa314d5a30" containerName="dnsmasq-dns" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427842 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="52461a44-ded9-4025-b0f1-85c22462a04f" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427851 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86c22f2-896c-4c29-95c7-024aea61dcd2" containerName="ovsdbserver-nb" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427862 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="065dfe75-7489-4b15-8a4d-4adf13393aea" containerName="openstack-network-exporter" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427872 4901 
memory_manager.go:354] "RemoveStaleState removing state" podUID="167ad9cc-678d-499b-9be0-2e74112f84c9" containerName="openstack-network-exporter" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427883 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc54c941-19d2-42c1-b9f0-a3a58999bda5" containerName="ovn-controller" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427899 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86c22f2-896c-4c29-95c7-024aea61dcd2" containerName="openstack-network-exporter" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.427911 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfd87218-f7dc-424a-acda-dd7b57792738" containerName="proxy-server" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.428819 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c1f-account-create-update-rllhk" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.439490 4901 generic.go:334] "Generic (PLEG): container finished" podID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" containerID="91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075" exitCode=0 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.439646 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9","Type":"ContainerDied","Data":"91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.439987 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.442403 4901 generic.go:334] "Generic (PLEG): container finished" podID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerID="148d0935d0af545a31bff0013cab741797e444db8fdba8b7ef3fe82da71d3a67" exitCode=143 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 
03:05:38.442467 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc8485b48-f86rl" event={"ID":"e5efc6dd-6a36-4491-b090-b4c9301ec7d0","Type":"ContainerDied","Data":"148d0935d0af545a31bff0013cab741797e444db8fdba8b7ef3fe82da71d3a67"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.445788 4901 generic.go:334] "Generic (PLEG): container finished" podID="baa336b3-abdd-43e2-9c54-6d8d34c71204" containerID="6e6b9a76927163c431cd48ce16cec53d765e06541e962dcf3dccf5f2b20e4b6e" exitCode=143 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.445819 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" event={"ID":"baa336b3-abdd-43e2-9c54-6d8d34c71204","Type":"ContainerDied","Data":"6e6b9a76927163c431cd48ce16cec53d765e06541e962dcf3dccf5f2b20e4b6e"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.457178 4901 generic.go:334] "Generic (PLEG): container finished" podID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerID="42ab54fcd0a589e516d8eb72267b465d9fcb7af80252fced9a388a56a784446b" exitCode=0 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.457516 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b7f6df545-whtgc" event={"ID":"a29f795d-59d2-4e43-a6ee-6190dc0ad67d","Type":"ContainerDied","Data":"42ab54fcd0a589e516d8eb72267b465d9fcb7af80252fced9a388a56a784446b"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.459815 4901 generic.go:334] "Generic (PLEG): container finished" podID="6790ccc5-8f7f-4de8-bd69-652661631307" containerID="4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772" exitCode=143 Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.459926 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8964d89c-vw8lc" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.460181 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nmx84" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.460345 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d497f76dc-pptvt" event={"ID":"6790ccc5-8f7f-4de8-bd69-652661631307","Type":"ContainerDied","Data":"4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772"} Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.460520 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5rg5k" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.461513 4901 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-d247z" secret="" err="secret \"galera-openstack-cell1-dockercfg-mhqqf\" not found" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.482779 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0c1f-account-create-update-ldjfp"] Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.494662 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpcm2\" (UniqueName: \"kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2\") pod \"keystone-0c1f-account-create-update-rllhk\" (UID: \"1d382c9a-714e-41fc-8266-8e4ee322f5c7\") " pod="openstack/keystone-0c1f-account-create-update-rllhk" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.495784 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 03:05:38 crc kubenswrapper[4901]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 09 03:05:38 crc kubenswrapper[4901]: Mar 09 03:05:38 crc kubenswrapper[4901]: MYSQL_REMOTE_HOST="" source 
/var/lib/operator-scripts/mysql_root_auth.sh Mar 09 03:05:38 crc kubenswrapper[4901]: Mar 09 03:05:38 crc kubenswrapper[4901]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 09 03:05:38 crc kubenswrapper[4901]: Mar 09 03:05:38 crc kubenswrapper[4901]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 09 03:05:38 crc kubenswrapper[4901]: Mar 09 03:05:38 crc kubenswrapper[4901]: if [ -n "" ]; then Mar 09 03:05:38 crc kubenswrapper[4901]: GRANT_DATABASE="" Mar 09 03:05:38 crc kubenswrapper[4901]: else Mar 09 03:05:38 crc kubenswrapper[4901]: GRANT_DATABASE="*" Mar 09 03:05:38 crc kubenswrapper[4901]: fi Mar 09 03:05:38 crc kubenswrapper[4901]: Mar 09 03:05:38 crc kubenswrapper[4901]: # going for maximum compatibility here: Mar 09 03:05:38 crc kubenswrapper[4901]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 09 03:05:38 crc kubenswrapper[4901]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 09 03:05:38 crc kubenswrapper[4901]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 09 03:05:38 crc kubenswrapper[4901]: # support updates Mar 09 03:05:38 crc kubenswrapper[4901]: Mar 09 03:05:38 crc kubenswrapper[4901]: $MYSQL_CMD < logger="UnhandledError" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.498785 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0c1f-account-create-update-ldjfp"] Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.498877 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-d247z" podUID="d57347fc-0546-466f-95e6-055857ca3685" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.500504 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts\") pod \"keystone-0c1f-account-create-update-rllhk\" (UID: \"1d382c9a-714e-41fc-8266-8e4ee322f5c7\") " pod="openstack/keystone-0c1f-account-create-update-rllhk" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.519028 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.536521 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cfd87218-f7dc-424a-acda-dd7b57792738" (UID: "cfd87218-f7dc-424a-acda-dd7b57792738"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.551842 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jl2xh"] Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.560033 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52461a44-ded9-4025-b0f1-85c22462a04f" (UID: "52461a44-ded9-4025-b0f1-85c22462a04f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.564701 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-config-data" (OuterVolumeSpecName: "config-data") pod "52461a44-ded9-4025-b0f1-85c22462a04f" (UID: "52461a44-ded9-4025-b0f1-85c22462a04f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.564750 4901 scope.go:117] "RemoveContainer" containerID="4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa" Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.568119 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa\": container with ID starting with 4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa not found: ID does not exist" containerID="4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.568163 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa"} err="failed to get container status \"4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa\": rpc error: code = NotFound desc = could not find container \"4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa\": container with ID starting with 4fc838d65f3d594291f10b8a95a561b0be21c7dd87bb4f1fb968c80a5bf041fa not found: ID does not exist" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.568191 4901 scope.go:117] "RemoveContainer" containerID="f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.570402 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-config-data" (OuterVolumeSpecName: "config-data") pod "cfd87218-f7dc-424a-acda-dd7b57792738" (UID: "cfd87218-f7dc-424a-acda-dd7b57792738"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.573673 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfd87218-f7dc-424a-acda-dd7b57792738" (UID: "cfd87218-f7dc-424a-acda-dd7b57792738"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.585802 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "52461a44-ded9-4025-b0f1-85c22462a04f" (UID: "52461a44-ded9-4025-b0f1-85c22462a04f"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.594658 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jl2xh"] Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.604961 4901 scope.go:117] "RemoveContainer" containerID="5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd" Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.605472 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-combined-ca-bundle\") pod \"f0098aa8-4248-48ec-a254-368c395308b1\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.605630 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-galera-tls-certs\") pod \"f0098aa8-4248-48ec-a254-368c395308b1\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") " 
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.605685 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-kolla-config\") pod \"f0098aa8-4248-48ec-a254-368c395308b1\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") "
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.606809 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f0098aa8-4248-48ec-a254-368c395308b1" (UID: "f0098aa8-4248-48ec-a254-368c395308b1"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.607059 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cfd87218-f7dc-424a-acda-dd7b57792738" (UID: "cfd87218-f7dc-424a-acda-dd7b57792738"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.607445 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"f0098aa8-4248-48ec-a254-368c395308b1\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") "
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.607476 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9fhm\" (UniqueName: \"kubernetes.io/projected/f0098aa8-4248-48ec-a254-368c395308b1-kube-api-access-q9fhm\") pod \"f0098aa8-4248-48ec-a254-368c395308b1\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") "
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.607521 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-operator-scripts\") pod \"f0098aa8-4248-48ec-a254-368c395308b1\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") "
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.607575 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-config-data-default\") pod \"f0098aa8-4248-48ec-a254-368c395308b1\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") "
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.607773 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f0098aa8-4248-48ec-a254-368c395308b1-config-data-generated\") pod \"f0098aa8-4248-48ec-a254-368c395308b1\" (UID: \"f0098aa8-4248-48ec-a254-368c395308b1\") "
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.608163 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts\") pod \"keystone-0c1f-account-create-update-rllhk\" (UID: \"1d382c9a-714e-41fc-8266-8e4ee322f5c7\") " pod="openstack/keystone-0c1f-account-create-update-rllhk"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.608451 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpcm2\" (UniqueName: \"kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2\") pod \"keystone-0c1f-account-create-update-rllhk\" (UID: \"1d382c9a-714e-41fc-8266-8e4ee322f5c7\") " pod="openstack/keystone-0c1f-account-create-update-rllhk"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.608506 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.608516 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.608525 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.608533 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.608541 4901 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-kolla-config\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.608551 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.608559 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd87218-f7dc-424a-acda-dd7b57792738-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.608566 4901 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.610377 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0098aa8-4248-48ec-a254-368c395308b1" (UID: "f0098aa8-4248-48ec-a254-368c395308b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.611033 4901 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.611092 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts podName:1d382c9a-714e-41fc-8266-8e4ee322f5c7 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:39.111071907 +0000 UTC m=+1463.700735639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts") pod "keystone-0c1f-account-create-update-rllhk" (UID: "1d382c9a-714e-41fc-8266-8e4ee322f5c7") : configmap "openstack-scripts" not found
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.611511 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0098aa8-4248-48ec-a254-368c395308b1-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f0098aa8-4248-48ec-a254-368c395308b1" (UID: "f0098aa8-4248-48ec-a254-368c395308b1"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.612581 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f0098aa8-4248-48ec-a254-368c395308b1" (UID: "f0098aa8-4248-48ec-a254-368c395308b1"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.612873 4901 projected.go:194] Error preparing data for projected volume kube-api-access-hpcm2 for pod openstack/keystone-0c1f-account-create-update-rllhk: failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.612970 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2 podName:1d382c9a-714e-41fc-8266-8e4ee322f5c7 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:39.112959314 +0000 UTC m=+1463.702623046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hpcm2" (UniqueName: "kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2") pod "keystone-0c1f-account-create-update-rllhk" (UID: "1d382c9a-714e-41fc-8266-8e4ee322f5c7") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.614392 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "52461a44-ded9-4025-b0f1-85c22462a04f" (UID: "52461a44-ded9-4025-b0f1-85c22462a04f"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.620179 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0098aa8-4248-48ec-a254-368c395308b1-kube-api-access-q9fhm" (OuterVolumeSpecName: "kube-api-access-q9fhm") pod "f0098aa8-4248-48ec-a254-368c395308b1" (UID: "f0098aa8-4248-48ec-a254-368c395308b1"). InnerVolumeSpecName "kube-api-access-q9fhm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.630352 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0c1f-account-create-update-rllhk"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.638034 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qzdch"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.644736 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-79578f965f-zpp5p"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.645273 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-79578f965f-zpp5p" podUID="e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" containerName="keystone-api" containerID="cri-o://2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355" gracePeriod=30
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.647090 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "f0098aa8-4248-48ec-a254-368c395308b1" (UID: "f0098aa8-4248-48ec-a254-368c395308b1"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.655767 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qzdch"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.660473 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.673608 4901 scope.go:117] "RemoveContainer" containerID="f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2"
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.678293 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2\": container with ID starting with f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2 not found: ID does not exist" containerID="f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.678406 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2"} err="failed to get container status \"f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2\": rpc error: code = NotFound desc = could not find container \"f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2\": container with ID starting with f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2 not found: ID does not exist"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.678955 4901 scope.go:117] "RemoveContainer" containerID="5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd"
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.681505 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd\": container with ID starting with 5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd not found: ID does not exist" containerID="5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.681554 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd"} err="failed to get container status \"5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd\": rpc error: code = NotFound desc = could not find container \"5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd\": container with ID starting with 5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd not found: ID does not exist"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.681580 4901 scope.go:117] "RemoveContainer" containerID="f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.682077 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2"} err="failed to get container status \"f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2\": rpc error: code = NotFound desc = could not find container \"f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2\": container with ID starting with f2e8fa826008c2c9f01478d90a32d0b9381a5e77f7d22cee3432c76f471846e2 not found: ID does not exist"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.682321 4901 scope.go:117] "RemoveContainer" containerID="5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.686369 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0098aa8-4248-48ec-a254-368c395308b1" (UID: "f0098aa8-4248-48ec-a254-368c395308b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.686467 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd"} err="failed to get container status \"5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd\": rpc error: code = NotFound desc = could not find container \"5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd\": container with ID starting with 5aadda7ec4443b77cc72cf257c26eb2f3cb4e579053dc6d7e5abdc079cb7dcfd not found: ID does not exist"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.686502 4901 scope.go:117] "RemoveContainer" containerID="6f9b2c7bb5b12105cf493783a9ea56c27bfdd210b2404ad5c2701248c69906c1"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.711246 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.713056 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9fhm\" (UniqueName: \"kubernetes.io/projected/f0098aa8-4248-48ec-a254-368c395308b1-kube-api-access-q9fhm\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.713141 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.713212 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f0098aa8-4248-48ec-a254-368c395308b1-config-data-default\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.713315 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f0098aa8-4248-48ec-a254-368c395308b1-config-data-generated\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.713386 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.713467 4901 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/52461a44-ded9-4025-b0f1-85c22462a04f-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.716334 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0c1f-account-create-update-rllhk"]
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.717921 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-hpcm2 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-0c1f-account-create-update-rllhk" podUID="1d382c9a-714e-41fc-8266-8e4ee322f5c7"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.720687 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f0098aa8-4248-48ec-a254-368c395308b1" (UID: "f0098aa8-4248-48ec-a254-368c395308b1"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.725740 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-cszqf"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.735246 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-cszqf"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.742953 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.743155 4901 scope.go:117] "RemoveContainer" containerID="58f28fe8133335254744ffb487e2889617fa95aef6fad8082e0e5543fe0012a2"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.753914 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-vw8lc"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.761663 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c8964d89c-vw8lc"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.771784 4901 scope.go:117] "RemoveContainer" containerID="1c5af01525dee027d91335bfc70f1c4c730cdc63e3d8be7b219fbbab05c905fa"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.806024 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.840316 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.848239 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.848749 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5rg5k"]
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.852151 4901 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.852231 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts podName:a6b8a05a-d698-4770-a883-fd60b61190b7 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:40.852204825 +0000 UTC m=+1465.441868557 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts") pod "nova-cell1-cddb-account-create-update-z7t9t" (UID: "a6b8a05a-d698-4770-a883-fd60b61190b7") : configmap "openstack-cell1-scripts" not found
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.854645 4901 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0098aa8-4248-48ec-a254-368c395308b1-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.854887 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.858743 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.860009 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5rg5k"]
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.861112 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.861164 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6b3d0806-00f0-46d7-a77f-f505583e49a2" containerName="nova-scheduler-scheduler"
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.867706 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-nmx84"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.877973 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-nmx84"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.893064 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.907381 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.913371 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-579bf976f9-ds45q"]
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.919758 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-579bf976f9-ds45q"]
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.957030 4901 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.957088 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data podName:98538e55-cb87-49e2-9fd5-fff06d7edfdd nodeName:}" failed. No retries permitted until 2026-03-09 03:05:42.957074657 +0000 UTC m=+1467.546738389 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data") pod "rabbitmq-cell1-server-0" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd") : configmap "rabbitmq-cell1-config-data" not found
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.957119 4901 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 09 03:05:38 crc kubenswrapper[4901]: E0309 03:05:38.957148 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts podName:d57347fc-0546-466f-95e6-055857ca3685 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:40.957131799 +0000 UTC m=+1465.546795531 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts") pod "root-account-create-update-d247z" (UID: "d57347fc-0546-466f-95e6-055857ca3685") : configmap "openstack-cell1-scripts" not found
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.962540 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="9df0684a-2816-4af7-97cf-00e31c542eef" containerName="galera" containerID="cri-o://a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0" gracePeriod=30
Mar 09 03:05:38 crc kubenswrapper[4901]: I0309 03:05:38.996856 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cddb-account-create-update-z7t9t"
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.006424 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3c33-account-create-update-8vqvj"
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.015916 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.037281 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.040748 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.040909 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.042470 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.042505 4901 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server"
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.048767 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.056407 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.056476 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovs-vswitchd"
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.058323 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts\") pod \"a6b8a05a-d698-4770-a883-fd60b61190b7\" (UID: \"a6b8a05a-d698-4770-a883-fd60b61190b7\") "
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.058427 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwprt\" (UniqueName: \"kubernetes.io/projected/a6b8a05a-d698-4770-a883-fd60b61190b7-kube-api-access-zwprt\") pod \"a6b8a05a-d698-4770-a883-fd60b61190b7\" (UID: \"a6b8a05a-d698-4770-a883-fd60b61190b7\") "
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.059964 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6b8a05a-d698-4770-a883-fd60b61190b7" (UID: "a6b8a05a-d698-4770-a883-fd60b61190b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.062652 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b8a05a-d698-4770-a883-fd60b61190b7-kube-api-access-zwprt" (OuterVolumeSpecName: "kube-api-access-zwprt") pod "a6b8a05a-d698-4770-a883-fd60b61190b7" (UID: "a6b8a05a-d698-4770-a883-fd60b61190b7"). InnerVolumeSpecName "kube-api-access-zwprt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.159758 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-config\") pod \"7050aa4c-725b-482a-8b90-f1374b3a4a42\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") "
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.159812 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5jhd\" (UniqueName: \"kubernetes.io/projected/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-api-access-z5jhd\") pod \"7050aa4c-725b-482a-8b90-f1374b3a4a42\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") "
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.159863 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-combined-ca-bundle\") pod \"7050aa4c-725b-482a-8b90-f1374b3a4a42\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") "
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.159952 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97944a12-e740-486d-ab39-6b03818f3cbd-operator-scripts\") pod \"97944a12-e740-486d-ab39-6b03818f3cbd\" (UID: \"97944a12-e740-486d-ab39-6b03818f3cbd\") "
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.160035 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q867c\" (UniqueName: \"kubernetes.io/projected/97944a12-e740-486d-ab39-6b03818f3cbd-kube-api-access-q867c\") pod \"97944a12-e740-486d-ab39-6b03818f3cbd\" (UID: \"97944a12-e740-486d-ab39-6b03818f3cbd\") "
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.160115 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-certs\") pod \"7050aa4c-725b-482a-8b90-f1374b3a4a42\" (UID: \"7050aa4c-725b-482a-8b90-f1374b3a4a42\") "
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.160540 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97944a12-e740-486d-ab39-6b03818f3cbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97944a12-e740-486d-ab39-6b03818f3cbd" (UID: "97944a12-e740-486d-ab39-6b03818f3cbd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.160808 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpcm2\" (UniqueName: \"kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2\") pod \"keystone-0c1f-account-create-update-rllhk\" (UID: \"1d382c9a-714e-41fc-8266-8e4ee322f5c7\") " pod="openstack/keystone-0c1f-account-create-update-rllhk"
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.160928 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts\") pod \"keystone-0c1f-account-create-update-rllhk\" (UID: \"1d382c9a-714e-41fc-8266-8e4ee322f5c7\") " pod="openstack/keystone-0c1f-account-create-update-rllhk"
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.161101 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b8a05a-d698-4770-a883-fd60b61190b7-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.161125 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwprt\" (UniqueName: \"kubernetes.io/projected/a6b8a05a-d698-4770-a883-fd60b61190b7-kube-api-access-zwprt\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.161140 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97944a12-e740-486d-ab39-6b03818f3cbd-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.161161 4901 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.161257 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts podName:1d382c9a-714e-41fc-8266-8e4ee322f5c7 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:40.161236371 +0000 UTC m=+1464.750900143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts") pod "keystone-0c1f-account-create-update-rllhk" (UID: "1d382c9a-714e-41fc-8266-8e4ee322f5c7") : configmap "openstack-scripts" not found
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.164957 4901 projected.go:194] Error preparing data for projected volume kube-api-access-hpcm2 for pod openstack/keystone-0c1f-account-create-update-rllhk: failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.165161 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2 podName:1d382c9a-714e-41fc-8266-8e4ee322f5c7 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:40.16514421 +0000 UTC m=+1464.754808012 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hpcm2" (UniqueName: "kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2") pod "keystone-0c1f-account-create-update-rllhk" (UID: "1d382c9a-714e-41fc-8266-8e4ee322f5c7") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.182905 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-api-access-z5jhd" (OuterVolumeSpecName: "kube-api-access-z5jhd") pod "7050aa4c-725b-482a-8b90-f1374b3a4a42" (UID: "7050aa4c-725b-482a-8b90-f1374b3a4a42"). InnerVolumeSpecName "kube-api-access-z5jhd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.184634 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "7050aa4c-725b-482a-8b90-f1374b3a4a42" (UID: "7050aa4c-725b-482a-8b90-f1374b3a4a42"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.184678 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97944a12-e740-486d-ab39-6b03818f3cbd-kube-api-access-q867c" (OuterVolumeSpecName: "kube-api-access-q867c") pod "97944a12-e740-486d-ab39-6b03818f3cbd" (UID: "97944a12-e740-486d-ab39-6b03818f3cbd"). InnerVolumeSpecName "kube-api-access-q867c".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.187475 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7050aa4c-725b-482a-8b90-f1374b3a4a42" (UID: "7050aa4c-725b-482a-8b90-f1374b3a4a42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.231661 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "7050aa4c-725b-482a-8b90-f1374b3a4a42" (UID: "7050aa4c-725b-482a-8b90-f1374b3a4a42"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.262580 4901 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.262605 4901 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.262616 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5jhd\" (UniqueName: \"kubernetes.io/projected/7050aa4c-725b-482a-8b90-f1374b3a4a42-kube-api-access-z5jhd\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.262626 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7050aa4c-725b-482a-8b90-f1374b3a4a42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.262635 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q867c\" (UniqueName: \"kubernetes.io/projected/97944a12-e740-486d-ab39-6b03818f3cbd-kube-api-access-q867c\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.475047 4901 generic.go:334] "Generic (PLEG): container finished" podID="7050aa4c-725b-482a-8b90-f1374b3a4a42" containerID="149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6" exitCode=2 Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.475263 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7050aa4c-725b-482a-8b90-f1374b3a4a42","Type":"ContainerDied","Data":"149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6"} Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.475648 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7050aa4c-725b-482a-8b90-f1374b3a4a42","Type":"ContainerDied","Data":"1fea3c826a151a41c7248570920e76e67f4663d409b5d8418412d59bbc46f4e5"} Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.475669 4901 scope.go:117] "RemoveContainer" containerID="149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.475336 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.479792 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3c33-account-create-update-8vqvj" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.480162 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3c33-account-create-update-8vqvj" event={"ID":"97944a12-e740-486d-ab39-6b03818f3cbd","Type":"ContainerDied","Data":"82ac3107b4fc1a61dc1013fced1cff813f09ec93dc369e57c611c0fa5b71c0d0"} Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.494496 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f0098aa8-4248-48ec-a254-368c395308b1","Type":"ContainerDied","Data":"1de970bcceff311fa19a500f73a8d13616b2bad13039b206aea33903c1d1417a"} Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.494513 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.506190 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="2b3e03cd-75ae-46dc-aee4-b778929cf535" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.108:11211: connect: connection refused" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.513274 4901 generic.go:334] "Generic (PLEG): container finished" podID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerID="2cd524a1798da2be5336faeee44da3b0c7ebcf43441ba6883148aca3906e53c0" exitCode=0 Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.513300 4901 generic.go:334] "Generic (PLEG): container finished" podID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerID="ef72ec9152a71c9764e40c053af5316561da42488861f73e1bfb72f8538b1bb8" exitCode=2 Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.513309 4901 generic.go:334] "Generic (PLEG): container finished" podID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerID="fd30c7afd1e4a7b025b8b590cdbf65ff2deb03badcd9cc3419bf24e1b542c1ed" exitCode=0 Mar 09 03:05:39 crc 
kubenswrapper[4901]: I0309 03:05:39.513351 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c3d4e9a-122e-4894-98b2-91784a9f44e8","Type":"ContainerDied","Data":"2cd524a1798da2be5336faeee44da3b0c7ebcf43441ba6883148aca3906e53c0"} Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.513375 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c3d4e9a-122e-4894-98b2-91784a9f44e8","Type":"ContainerDied","Data":"ef72ec9152a71c9764e40c053af5316561da42488861f73e1bfb72f8538b1bb8"} Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.513385 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c3d4e9a-122e-4894-98b2-91784a9f44e8","Type":"ContainerDied","Data":"fd30c7afd1e4a7b025b8b590cdbf65ff2deb03badcd9cc3419bf24e1b542c1ed"} Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.519475 4901 generic.go:334] "Generic (PLEG): container finished" podID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" containerID="ff023d589de2e36a5396ff975d9b99b7c27af8bbe416330e6c016841896c5a55" exitCode=0 Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.519557 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ddb85f7bb-phpwc" event={"ID":"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3","Type":"ContainerDied","Data":"ff023d589de2e36a5396ff975d9b99b7c27af8bbe416330e6c016841896c5a55"} Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.531289 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.532363 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" event={"ID":"a6b8a05a-d698-4770-a883-fd60b61190b7","Type":"ContainerDied","Data":"1e0d2229701327b5551de4e9a3afef86818395b4442cd0ea81a014074d9c3a4b"} Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.532472 4901 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cddb-account-create-update-z7t9t" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.537696 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c1f-account-create-update-rllhk" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.543138 4901 scope.go:117] "RemoveContainer" containerID="149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.544521 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.552150 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6\": container with ID starting with 149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6 not found: ID does not exist" containerID="149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.552355 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6"} err="failed to get container status \"149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6\": rpc error: code = NotFound desc = could not find container \"149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6\": container with ID starting with 149efda8ee8e0be18d7dd9fb462950b6382146330b30292eff8d423e80ed5cc6 not found: ID does not exist" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.556003 4901 scope.go:117] "RemoveContainer" containerID="b4c164d97e9bee042efac1c9e73f97d746429bbdb1ecf72f43fc47c10cd2ec24" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.561239 4901 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/keystone-0c1f-account-create-update-rllhk" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.579637 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3c33-account-create-update-8vqvj"] Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.593846 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3c33-account-create-update-8vqvj"] Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.604996 4901 scope.go:117] "RemoveContainer" containerID="3a05919e373ebb6f88b2fb0ed9c30b2b394efeac74eb31a4d3a3029fe54bc70d" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.608788 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.620626 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.627012 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.638813 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-z7t9t"] Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.642734 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cddb-account-create-update-z7t9t"] Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.771849 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-public-tls-certs\") pod \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.773046 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-scripts\") pod \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.773121 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-logs\") pod \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.773188 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-internal-tls-certs\") pod \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.773280 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nm4l\" (UniqueName: 
\"kubernetes.io/projected/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-kube-api-access-9nm4l\") pod \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.773699 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-combined-ca-bundle\") pod \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.773771 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-config-data\") pod \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\" (UID: \"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3\") " Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.773785 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-logs" (OuterVolumeSpecName: "logs") pod "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" (UID: "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.774749 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.779499 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-scripts" (OuterVolumeSpecName: "scripts") pod "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" (UID: "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.799001 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-kube-api-access-9nm4l" (OuterVolumeSpecName: "kube-api-access-9nm4l") pod "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" (UID: "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3"). InnerVolumeSpecName "kube-api-access-9nm4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.838423 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-config-data" (OuterVolumeSpecName: "config-data") pod "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" (UID: "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.854300 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.169:8776/healthcheck\": read tcp 10.217.0.2:40164->10.217.0.169:8776: read: connection reset by peer" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.858450 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" (UID: "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.875864 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.875978 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nm4l\" (UniqueName: \"kubernetes.io/projected/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-kube-api-access-9nm4l\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.876037 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.876090 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.942073 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" (UID: "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.949765 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" (UID: "966d96ae-fba9-4ecd-85e5-a81cecfb2ed3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.978127 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: I0309 03:05:39.978148 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.978238 4901 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 09 03:05:39 crc kubenswrapper[4901]: E0309 03:05:39.978279 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data podName:46c7df0b-fc0a-4fd9-b097-72da03442510 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:43.978266965 +0000 UTC m=+1468.567930697 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data") pod "rabbitmq-server-0" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510") : configmap "rabbitmq-config-data" not found Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.100383 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.105029 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d247z" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.123647 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="065dfe75-7489-4b15-8a4d-4adf13393aea" path="/var/lib/kubelet/pods/065dfe75-7489-4b15-8a4d-4adf13393aea/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.124306 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b6c57bb-65c0-4563-a902-94e55b6f0713" path="/var/lib/kubelet/pods/0b6c57bb-65c0-4563-a902-94e55b6f0713/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.124788 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c171ad-86f3-4601-8abd-89334e351bc8" path="/var/lib/kubelet/pods/10c171ad-86f3-4601-8abd-89334e351bc8/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.125870 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="167ad9cc-678d-499b-9be0-2e74112f84c9" path="/var/lib/kubelet/pods/167ad9cc-678d-499b-9be0-2e74112f84c9/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.126443 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba69329-2c9f-4938-89b0-d1fa314d5a30" path="/var/lib/kubelet/pods/4ba69329-2c9f-4938-89b0-d1fa314d5a30/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.126968 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5140d205-de33-4e39-95fb-451471d3e7e9" path="/var/lib/kubelet/pods/5140d205-de33-4e39-95fb-451471d3e7e9/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.127847 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52461a44-ded9-4025-b0f1-85c22462a04f" path="/var/lib/kubelet/pods/52461a44-ded9-4025-b0f1-85c22462a04f/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.128467 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7050aa4c-725b-482a-8b90-f1374b3a4a42" 
path="/var/lib/kubelet/pods/7050aa4c-725b-482a-8b90-f1374b3a4a42/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.129048 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97944a12-e740-486d-ab39-6b03818f3cbd" path="/var/lib/kubelet/pods/97944a12-e740-486d-ab39-6b03818f3cbd/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.129391 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b8a05a-d698-4770-a883-fd60b61190b7" path="/var/lib/kubelet/pods/a6b8a05a-d698-4770-a883-fd60b61190b7/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.130149 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfd87218-f7dc-424a-acda-dd7b57792738" path="/var/lib/kubelet/pods/cfd87218-f7dc-424a-acda-dd7b57792738/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.131368 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc54c941-19d2-42c1-b9f0-a3a58999bda5" path="/var/lib/kubelet/pods/dc54c941-19d2-42c1-b9f0-a3a58999bda5/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.131885 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20e8f79-9634-43fd-a9a5-d2710f828a86" path="/var/lib/kubelet/pods/e20e8f79-9634-43fd-a9a5-d2710f828a86/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.132893 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0098aa8-4248-48ec-a254-368c395308b1" path="/var/lib/kubelet/pods/f0098aa8-4248-48ec-a254-368c395308b1/volumes" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.180456 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-kolla-config\") pod \"2b3e03cd-75ae-46dc-aee4-b778929cf535\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.180545 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts\") pod \"d57347fc-0546-466f-95e6-055857ca3685\" (UID: \"d57347fc-0546-466f-95e6-055857ca3685\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.180565 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-combined-ca-bundle\") pod \"2b3e03cd-75ae-46dc-aee4-b778929cf535\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.180637 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlh9p\" (UniqueName: \"kubernetes.io/projected/d57347fc-0546-466f-95e6-055857ca3685-kube-api-access-vlh9p\") pod \"d57347fc-0546-466f-95e6-055857ca3685\" (UID: \"d57347fc-0546-466f-95e6-055857ca3685\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.180664 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-memcached-tls-certs\") pod \"2b3e03cd-75ae-46dc-aee4-b778929cf535\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.180720 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-config-data\") pod \"2b3e03cd-75ae-46dc-aee4-b778929cf535\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.180784 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5zzp\" (UniqueName: \"kubernetes.io/projected/2b3e03cd-75ae-46dc-aee4-b778929cf535-kube-api-access-n5zzp\") 
pod \"2b3e03cd-75ae-46dc-aee4-b778929cf535\" (UID: \"2b3e03cd-75ae-46dc-aee4-b778929cf535\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.182392 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpcm2\" (UniqueName: \"kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2\") pod \"keystone-0c1f-account-create-update-rllhk\" (UID: \"1d382c9a-714e-41fc-8266-8e4ee322f5c7\") " pod="openstack/keystone-0c1f-account-create-update-rllhk" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.182480 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts\") pod \"keystone-0c1f-account-create-update-rllhk\" (UID: \"1d382c9a-714e-41fc-8266-8e4ee322f5c7\") " pod="openstack/keystone-0c1f-account-create-update-rllhk" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.183154 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2b3e03cd-75ae-46dc-aee4-b778929cf535" (UID: "2b3e03cd-75ae-46dc-aee4-b778929cf535"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.183172 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-config-data" (OuterVolumeSpecName: "config-data") pod "2b3e03cd-75ae-46dc-aee4-b778929cf535" (UID: "2b3e03cd-75ae-46dc-aee4-b778929cf535"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.183696 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d57347fc-0546-466f-95e6-055857ca3685" (UID: "d57347fc-0546-466f-95e6-055857ca3685"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: E0309 03:05:40.185648 4901 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 09 03:05:40 crc kubenswrapper[4901]: E0309 03:05:40.185797 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts podName:1d382c9a-714e-41fc-8266-8e4ee322f5c7 nodeName:}" failed. No retries permitted until 2026-03-09 03:05:42.185735842 +0000 UTC m=+1466.775399574 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts") pod "keystone-0c1f-account-create-update-rllhk" (UID: "1d382c9a-714e-41fc-8266-8e4ee322f5c7") : configmap "openstack-scripts" not found Mar 09 03:05:40 crc kubenswrapper[4901]: E0309 03:05:40.193340 4901 projected.go:194] Error preparing data for projected volume kube-api-access-hpcm2 for pod openstack/keystone-0c1f-account-create-update-rllhk: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 09 03:05:40 crc kubenswrapper[4901]: E0309 03:05:40.193407 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2 podName:1d382c9a-714e-41fc-8266-8e4ee322f5c7 nodeName:}" failed. 
No retries permitted until 2026-03-09 03:05:42.193388066 +0000 UTC m=+1466.783051798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hpcm2" (UniqueName: "kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2") pod "keystone-0c1f-account-create-update-rllhk" (UID: "1d382c9a-714e-41fc-8266-8e4ee322f5c7") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.193129 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57347fc-0546-466f-95e6-055857ca3685-kube-api-access-vlh9p" (OuterVolumeSpecName: "kube-api-access-vlh9p") pod "d57347fc-0546-466f-95e6-055857ca3685" (UID: "d57347fc-0546-466f-95e6-055857ca3685"). InnerVolumeSpecName "kube-api-access-vlh9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.219590 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b3e03cd-75ae-46dc-aee4-b778929cf535" (UID: "2b3e03cd-75ae-46dc-aee4-b778929cf535"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.240645 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3e03cd-75ae-46dc-aee4-b778929cf535-kube-api-access-n5zzp" (OuterVolumeSpecName: "kube-api-access-n5zzp") pod "2b3e03cd-75ae-46dc-aee4-b778929cf535" (UID: "2b3e03cd-75ae-46dc-aee4-b778929cf535"). InnerVolumeSpecName "kube-api-access-n5zzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.246069 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "2b3e03cd-75ae-46dc-aee4-b778929cf535" (UID: "2b3e03cd-75ae-46dc-aee4-b778929cf535"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.284538 4901 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.284579 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57347fc-0546-466f-95e6-055857ca3685-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.284592 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.284604 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlh9p\" (UniqueName: \"kubernetes.io/projected/d57347fc-0546-466f-95e6-055857ca3685-kube-api-access-vlh9p\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.284618 4901 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3e03cd-75ae-46dc-aee4-b778929cf535-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.284630 4901 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/2b3e03cd-75ae-46dc-aee4-b778929cf535-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.284644 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5zzp\" (UniqueName: \"kubernetes.io/projected/2b3e03cd-75ae-46dc-aee4-b778929cf535-kube-api-access-n5zzp\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.340960 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386092 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data\") pod \"719d451b-159a-4fa7-9c72-54f42fb4f216\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386176 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-internal-tls-certs\") pod \"719d451b-159a-4fa7-9c72-54f42fb4f216\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386201 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data-custom\") pod \"719d451b-159a-4fa7-9c72-54f42fb4f216\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386258 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-public-tls-certs\") pod \"719d451b-159a-4fa7-9c72-54f42fb4f216\" (UID: 
\"719d451b-159a-4fa7-9c72-54f42fb4f216\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386284 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/719d451b-159a-4fa7-9c72-54f42fb4f216-etc-machine-id\") pod \"719d451b-159a-4fa7-9c72-54f42fb4f216\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386357 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhmww\" (UniqueName: \"kubernetes.io/projected/719d451b-159a-4fa7-9c72-54f42fb4f216-kube-api-access-hhmww\") pod \"719d451b-159a-4fa7-9c72-54f42fb4f216\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386402 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-combined-ca-bundle\") pod \"719d451b-159a-4fa7-9c72-54f42fb4f216\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386445 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719d451b-159a-4fa7-9c72-54f42fb4f216-logs\") pod \"719d451b-159a-4fa7-9c72-54f42fb4f216\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386468 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-scripts\") pod \"719d451b-159a-4fa7-9c72-54f42fb4f216\" (UID: \"719d451b-159a-4fa7-9c72-54f42fb4f216\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386547 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/719d451b-159a-4fa7-9c72-54f42fb4f216-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "719d451b-159a-4fa7-9c72-54f42fb4f216" (UID: "719d451b-159a-4fa7-9c72-54f42fb4f216"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.386897 4901 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/719d451b-159a-4fa7-9c72-54f42fb4f216-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.389404 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719d451b-159a-4fa7-9c72-54f42fb4f216-logs" (OuterVolumeSpecName: "logs") pod "719d451b-159a-4fa7-9c72-54f42fb4f216" (UID: "719d451b-159a-4fa7-9c72-54f42fb4f216"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.389830 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "719d451b-159a-4fa7-9c72-54f42fb4f216" (UID: "719d451b-159a-4fa7-9c72-54f42fb4f216"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.390686 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-scripts" (OuterVolumeSpecName: "scripts") pod "719d451b-159a-4fa7-9c72-54f42fb4f216" (UID: "719d451b-159a-4fa7-9c72-54f42fb4f216"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.391440 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719d451b-159a-4fa7-9c72-54f42fb4f216-kube-api-access-hhmww" (OuterVolumeSpecName: "kube-api-access-hhmww") pod "719d451b-159a-4fa7-9c72-54f42fb4f216" (UID: "719d451b-159a-4fa7-9c72-54f42fb4f216"). InnerVolumeSpecName "kube-api-access-hhmww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.408397 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.443715 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "719d451b-159a-4fa7-9c72-54f42fb4f216" (UID: "719d451b-159a-4fa7-9c72-54f42fb4f216"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.454864 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "719d451b-159a-4fa7-9c72-54f42fb4f216" (UID: "719d451b-159a-4fa7-9c72-54f42fb4f216"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.464536 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "719d451b-159a-4fa7-9c72-54f42fb4f216" (UID: "719d451b-159a-4fa7-9c72-54f42fb4f216"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.465071 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cc8485b48-f86rl" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:46316->10.217.0.168:9311: read: connection reset by peer" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.465393 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cc8485b48-f86rl" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:46330->10.217.0.168:9311: read: connection reset by peer" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.468931 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data" (OuterVolumeSpecName: "config-data") pod "719d451b-159a-4fa7-9c72-54f42fb4f216" (UID: "719d451b-159a-4fa7-9c72-54f42fb4f216"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.487583 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd100e4-dfd3-45a7-a97c-84a05c352883-logs\") pod \"7cd100e4-dfd3-45a7-a97c-84a05c352883\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.487724 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chvdf\" (UniqueName: \"kubernetes.io/projected/7cd100e4-dfd3-45a7-a97c-84a05c352883-kube-api-access-chvdf\") pod \"7cd100e4-dfd3-45a7-a97c-84a05c352883\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.487748 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-combined-ca-bundle\") pod \"7cd100e4-dfd3-45a7-a97c-84a05c352883\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.487789 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-config-data\") pod \"7cd100e4-dfd3-45a7-a97c-84a05c352883\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.487819 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-nova-metadata-tls-certs\") pod \"7cd100e4-dfd3-45a7-a97c-84a05c352883\" (UID: \"7cd100e4-dfd3-45a7-a97c-84a05c352883\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.488166 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhmww\" (UniqueName: 
\"kubernetes.io/projected/719d451b-159a-4fa7-9c72-54f42fb4f216-kube-api-access-hhmww\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.488182 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.488191 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719d451b-159a-4fa7-9c72-54f42fb4f216-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.488200 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.488207 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.488215 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.488246 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.488254 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719d451b-159a-4fa7-9c72-54f42fb4f216-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc 
kubenswrapper[4901]: I0309 03:05:40.488546 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cd100e4-dfd3-45a7-a97c-84a05c352883-logs" (OuterVolumeSpecName: "logs") pod "7cd100e4-dfd3-45a7-a97c-84a05c352883" (UID: "7cd100e4-dfd3-45a7-a97c-84a05c352883"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.494168 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd100e4-dfd3-45a7-a97c-84a05c352883-kube-api-access-chvdf" (OuterVolumeSpecName: "kube-api-access-chvdf") pod "7cd100e4-dfd3-45a7-a97c-84a05c352883" (UID: "7cd100e4-dfd3-45a7-a97c-84a05c352883"). InnerVolumeSpecName "kube-api-access-chvdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.513313 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cd100e4-dfd3-45a7-a97c-84a05c352883" (UID: "7cd100e4-dfd3-45a7-a97c-84a05c352883"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.524602 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7cd100e4-dfd3-45a7-a97c-84a05c352883" (UID: "7cd100e4-dfd3-45a7-a97c-84a05c352883"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.529030 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-config-data" (OuterVolumeSpecName: "config-data") pod "7cd100e4-dfd3-45a7-a97c-84a05c352883" (UID: "7cd100e4-dfd3-45a7-a97c-84a05c352883"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.550974 4901 generic.go:334] "Generic (PLEG): container finished" podID="2b3e03cd-75ae-46dc-aee4-b778929cf535" containerID="854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5" exitCode=0 Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.551022 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2b3e03cd-75ae-46dc-aee4-b778929cf535","Type":"ContainerDied","Data":"854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5"} Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.551068 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2b3e03cd-75ae-46dc-aee4-b778929cf535","Type":"ContainerDied","Data":"c848cdd78c97d85450b65da73f826ef1b05faf1160a2df5372998fbcfd87cff9"} Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.551085 4901 scope.go:117] "RemoveContainer" containerID="854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.551264 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.566295 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ddb85f7bb-phpwc" event={"ID":"966d96ae-fba9-4ecd-85e5-a81cecfb2ed3","Type":"ContainerDied","Data":"0c6cb5a8d63f828a66ef60ea27462787d5ebbbb27ddd435f77fedbaa4692fc7f"} Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.566367 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ddb85f7bb-phpwc" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.567366 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.568192 4901 generic.go:334] "Generic (PLEG): container finished" podID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerID="bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7" exitCode=0 Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.568251 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"719d451b-159a-4fa7-9c72-54f42fb4f216","Type":"ContainerDied","Data":"bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7"} Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.568275 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"719d451b-159a-4fa7-9c72-54f42fb4f216","Type":"ContainerDied","Data":"ae52503a05f8a417fd0d4fe6767b11247b012f0c8a7497a4bdaf32ad11e4ccb2"} Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.568288 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.586820 4901 scope.go:117] "RemoveContainer" containerID="854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.589098 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.589498 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd100e4-dfd3-45a7-a97c-84a05c352883-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.589525 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chvdf\" (UniqueName: \"kubernetes.io/projected/7cd100e4-dfd3-45a7-a97c-84a05c352883-kube-api-access-chvdf\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.589536 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.589548 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.589557 4901 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd100e4-dfd3-45a7-a97c-84a05c352883-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: E0309 03:05:40.592520 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5\": container with 
ID starting with 854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5 not found: ID does not exist" containerID="854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.592560 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5"} err="failed to get container status \"854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5\": rpc error: code = NotFound desc = could not find container \"854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5\": container with ID starting with 854d0438c9e18acb0221644ecbe21cbe51324ea8c9135307368198911d10bbd5 not found: ID does not exist" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.592601 4901 scope.go:117] "RemoveContainer" containerID="ff023d589de2e36a5396ff975d9b99b7c27af8bbe416330e6c016841896c5a55" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.593203 4901 generic.go:334] "Generic (PLEG): container finished" podID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerID="633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57" exitCode=0 Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.593314 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7cd100e4-dfd3-45a7-a97c-84a05c352883","Type":"ContainerDied","Data":"633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57"} Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.593343 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7cd100e4-dfd3-45a7-a97c-84a05c352883","Type":"ContainerDied","Data":"55cb610bc695f2d82cda0af0f6fffc59f7cbc2cd09f4aeebbcc8033406228d71"} Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.593316 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.594328 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.604357 4901 generic.go:334] "Generic (PLEG): container finished" podID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerID="f97ba1fa97a190b5ca6b6e5bbe19c6161fb765c75b9fffa80f6b76be75c0c237" exitCode=0 Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.604418 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d","Type":"ContainerDied","Data":"f97ba1fa97a190b5ca6b6e5bbe19c6161fb765c75b9fffa80f6b76be75c0c237"} Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.604433 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.617570 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5ddb85f7bb-phpwc"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.619106 4901 generic.go:334] "Generic (PLEG): container finished" podID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerID="cf4f1b9588039d52f25928fa5f258488b5893e580645e30a5d8a63dcf3396c0b" exitCode=0 Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.620248 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc8485b48-f86rl" event={"ID":"e5efc6dd-6a36-4491-b090-b4c9301ec7d0","Type":"ContainerDied","Data":"cf4f1b9588039d52f25928fa5f258488b5893e580645e30a5d8a63dcf3396c0b"} Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.626263 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d247z" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.626287 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0c1f-account-create-update-rllhk" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.626331 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d247z" event={"ID":"d57347fc-0546-466f-95e6-055857ca3685","Type":"ContainerDied","Data":"6be556f249be4c79573196413b63a0499871594f221f3a791d4e9a538be6e9fc"} Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.632286 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5ddb85f7bb-phpwc"] Mar 09 03:05:40 crc kubenswrapper[4901]: E0309 03:05:40.641543 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2a8500133ddbae16882734d17dcfeac24a437220c873e5c49b9335461b23a2a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.650178 4901 scope.go:117] "RemoveContainer" containerID="4da5234fa05d69c236f53cbbc103505a093111b7b3b09f7a401eaded8dc333cb" Mar 09 03:05:40 crc kubenswrapper[4901]: E0309 03:05:40.651654 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2a8500133ddbae16882734d17dcfeac24a437220c873e5c49b9335461b23a2a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 09 03:05:40 crc kubenswrapper[4901]: E0309 03:05:40.660924 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2a8500133ddbae16882734d17dcfeac24a437220c873e5c49b9335461b23a2a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 09 03:05:40 crc kubenswrapper[4901]: E0309 03:05:40.660998 4901 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="412e01f6-e4bb-4bbd-ba88-5726f3e2f87f" containerName="nova-cell0-conductor-conductor" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.661904 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.680072 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.690344 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tq6z\" (UniqueName: \"kubernetes.io/projected/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-kube-api-access-7tq6z\") pod \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.690510 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-public-tls-certs\") pod \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.690554 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-internal-tls-certs\") pod \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.690581 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-combined-ca-bundle\") pod \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\" 
(UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.690630 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-logs\") pod \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.690678 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-config-data\") pod \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\" (UID: \"ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d\") " Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.693410 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-kube-api-access-7tq6z" (OuterVolumeSpecName: "kube-api-access-7tq6z") pod "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" (UID: "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d"). InnerVolumeSpecName "kube-api-access-7tq6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.693719 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-logs" (OuterVolumeSpecName: "logs") pod "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" (UID: "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.699867 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.725902 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.743447 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-config-data" (OuterVolumeSpecName: "config-data") pod "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" (UID: "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.743488 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" (UID: "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.750769 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.205:3000/\": dial tcp 10.217.0.205:3000: connect: connection refused" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.758570 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0c1f-account-create-update-rllhk"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.771762 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" (UID: "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.771875 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" (UID: "ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.773969 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0c1f-account-create-update-rllhk"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.793459 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.793494 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.793503 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.793513 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.793523 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.793531 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tq6z\" (UniqueName: \"kubernetes.io/projected/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d-kube-api-access-7tq6z\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.818376 4901 scope.go:117] "RemoveContainer" 
containerID="bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.856582 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d247z"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.864553 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-d247z"] Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.895331 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpcm2\" (UniqueName: \"kubernetes.io/projected/1d382c9a-714e-41fc-8266-8e4ee322f5c7-kube-api-access-hpcm2\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.895359 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d382c9a-714e-41fc-8266-8e4ee322f5c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:40 crc kubenswrapper[4901]: I0309 03:05:40.895458 4901 scope.go:117] "RemoveContainer" containerID="3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.369197 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.371588 4901 scope.go:117] "RemoveContainer" containerID="bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7" Mar 09 03:05:41 crc kubenswrapper[4901]: E0309 03:05:41.372148 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7\": container with ID starting with bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7 not found: ID does not exist" containerID="bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.372340 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7"} err="failed to get container status \"bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7\": rpc error: code = NotFound desc = could not find container \"bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7\": container with ID starting with bddc8674c0121619ce3f35bb3a4687012269faf10b370e3405b16aff91ea8de7 not found: ID does not exist" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.372483 4901 scope.go:117] "RemoveContainer" containerID="3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5" Mar 09 03:05:41 crc kubenswrapper[4901]: E0309 03:05:41.372866 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5\": container with ID starting with 3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5 not found: ID does not exist" containerID="3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 
03:05:41.372898 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5"} err="failed to get container status \"3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5\": rpc error: code = NotFound desc = could not find container \"3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5\": container with ID starting with 3d80911034ba2c7400726494bee1c9c208e59f1ab6ee5fe2955eaf17b33102d5 not found: ID does not exist" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.372920 4901 scope.go:117] "RemoveContainer" containerID="633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.382659 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.387937 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.390575 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_26f9c7a2-e2b4-4be1-8206-6c067702cc74/ovn-northd/0.log" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.390780 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.399622 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.408206 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.413927 4901 scope.go:117] "RemoveContainer" containerID="243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.438892 4901 scope.go:117] "RemoveContainer" containerID="633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57" Mar 09 03:05:41 crc kubenswrapper[4901]: E0309 03:05:41.439195 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57\": container with ID starting with 633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57 not found: ID does not exist" containerID="633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.439295 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57"} err="failed to get container status \"633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57\": rpc error: code = NotFound desc = could not find container \"633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57\": container with ID starting with 633856b9abfea0469521c94f58caeaeb509a2acfab5f38815cbd06bf2bc15a57 not found: ID does not exist" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.439326 4901 scope.go:117] "RemoveContainer" containerID="243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d" Mar 09 03:05:41 crc kubenswrapper[4901]: E0309 03:05:41.439575 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d\": container with ID starting with 
243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d not found: ID does not exist" containerID="243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.439590 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d"} err="failed to get container status \"243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d\": rpc error: code = NotFound desc = could not find container \"243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d\": container with ID starting with 243e391338d48efa65297ce9611e5eccbd8854abfd2a58cc9b19ec8fa2a3478d not found: ID does not exist" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.439602 4901 scope.go:117] "RemoveContainer" containerID="f97ba1fa97a190b5ca6b6e5bbe19c6161fb765c75b9fffa80f6b76be75c0c237" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.474970 4901 scope.go:117] "RemoveContainer" containerID="8f3e0ecc8701ef810ffac6d0b7ee7465e40d3318f8d6cae6a1382396e02c80d0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.521811 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-public-tls-certs\") pod \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.521862 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"9df0684a-2816-4af7-97cf-00e31c542eef\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.521895 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-config\") pod \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.521923 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-default\") pod \"9df0684a-2816-4af7-97cf-00e31c542eef\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.521946 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-combined-ca-bundle\") pod \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.521988 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-combined-ca-bundle\") pod \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522014 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data\") pod \"6790ccc5-8f7f-4de8-bd69-652661631307\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522045 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data-custom\") pod \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " Mar 09 03:05:41 crc kubenswrapper[4901]: 
I0309 03:05:41.522070 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6790ccc5-8f7f-4de8-bd69-652661631307-logs\") pod \"6790ccc5-8f7f-4de8-bd69-652661631307\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522099 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-combined-ca-bundle\") pod \"9df0684a-2816-4af7-97cf-00e31c542eef\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522124 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-combined-ca-bundle\") pod \"6790ccc5-8f7f-4de8-bd69-652661631307\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522153 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-operator-scripts\") pod \"9df0684a-2816-4af7-97cf-00e31c542eef\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522185 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-metrics-certs-tls-certs\") pod \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522240 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-northd-tls-certs\") pod \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522267 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data-custom\") pod \"6790ccc5-8f7f-4de8-bd69-652661631307\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522293 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-internal-tls-certs\") pod \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522324 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-scripts\") pod \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522375 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-rundir\") pod \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522416 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-generated\") pod \"9df0684a-2816-4af7-97cf-00e31c542eef\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 
03:05:41.522447 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbn95\" (UniqueName: \"kubernetes.io/projected/26f9c7a2-e2b4-4be1-8206-6c067702cc74-kube-api-access-vbn95\") pod \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\" (UID: \"26f9c7a2-e2b4-4be1-8206-6c067702cc74\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522474 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-kolla-config\") pod \"9df0684a-2816-4af7-97cf-00e31c542eef\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522524 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws4pt\" (UniqueName: \"kubernetes.io/projected/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-kube-api-access-ws4pt\") pod \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522561 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czcks\" (UniqueName: \"kubernetes.io/projected/9df0684a-2816-4af7-97cf-00e31c542eef-kube-api-access-czcks\") pod \"9df0684a-2816-4af7-97cf-00e31c542eef\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522591 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-logs\") pod \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522620 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-galera-tls-certs\") pod \"9df0684a-2816-4af7-97cf-00e31c542eef\" (UID: \"9df0684a-2816-4af7-97cf-00e31c542eef\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522658 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data\") pod \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\" (UID: \"e5efc6dd-6a36-4491-b090-b4c9301ec7d0\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.522683 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdcnr\" (UniqueName: \"kubernetes.io/projected/6790ccc5-8f7f-4de8-bd69-652661631307-kube-api-access-wdcnr\") pod \"6790ccc5-8f7f-4de8-bd69-652661631307\" (UID: \"6790ccc5-8f7f-4de8-bd69-652661631307\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.524534 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6790ccc5-8f7f-4de8-bd69-652661631307-logs" (OuterVolumeSpecName: "logs") pod "6790ccc5-8f7f-4de8-bd69-652661631307" (UID: "6790ccc5-8f7f-4de8-bd69-652661631307"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.524643 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-config" (OuterVolumeSpecName: "config") pod "26f9c7a2-e2b4-4be1-8206-6c067702cc74" (UID: "26f9c7a2-e2b4-4be1-8206-6c067702cc74"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.525890 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9df0684a-2816-4af7-97cf-00e31c542eef" (UID: "9df0684a-2816-4af7-97cf-00e31c542eef"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.526662 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9df0684a-2816-4af7-97cf-00e31c542eef" (UID: "9df0684a-2816-4af7-97cf-00e31c542eef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.529406 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6790ccc5-8f7f-4de8-bd69-652661631307-kube-api-access-wdcnr" (OuterVolumeSpecName: "kube-api-access-wdcnr") pod "6790ccc5-8f7f-4de8-bd69-652661631307" (UID: "6790ccc5-8f7f-4de8-bd69-652661631307"). InnerVolumeSpecName "kube-api-access-wdcnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.529839 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9df0684a-2816-4af7-97cf-00e31c542eef" (UID: "9df0684a-2816-4af7-97cf-00e31c542eef"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.539590 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "26f9c7a2-e2b4-4be1-8206-6c067702cc74" (UID: "26f9c7a2-e2b4-4be1-8206-6c067702cc74"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.539938 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-scripts" (OuterVolumeSpecName: "scripts") pod "26f9c7a2-e2b4-4be1-8206-6c067702cc74" (UID: "26f9c7a2-e2b4-4be1-8206-6c067702cc74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.540080 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-logs" (OuterVolumeSpecName: "logs") pod "e5efc6dd-6a36-4491-b090-b4c9301ec7d0" (UID: "e5efc6dd-6a36-4491-b090-b4c9301ec7d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.540950 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9df0684a-2816-4af7-97cf-00e31c542eef" (UID: "9df0684a-2816-4af7-97cf-00e31c542eef"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.559578 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-kube-api-access-ws4pt" (OuterVolumeSpecName: "kube-api-access-ws4pt") pod "e5efc6dd-6a36-4491-b090-b4c9301ec7d0" (UID: "e5efc6dd-6a36-4491-b090-b4c9301ec7d0"). InnerVolumeSpecName "kube-api-access-ws4pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.562695 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f9c7a2-e2b4-4be1-8206-6c067702cc74-kube-api-access-vbn95" (OuterVolumeSpecName: "kube-api-access-vbn95") pod "26f9c7a2-e2b4-4be1-8206-6c067702cc74" (UID: "26f9c7a2-e2b4-4be1-8206-6c067702cc74"). InnerVolumeSpecName "kube-api-access-vbn95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.581016 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e5efc6dd-6a36-4491-b090-b4c9301ec7d0" (UID: "e5efc6dd-6a36-4491-b090-b4c9301ec7d0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.581876 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6790ccc5-8f7f-4de8-bd69-652661631307" (UID: "6790ccc5-8f7f-4de8-bd69-652661631307"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.582980 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9df0684a-2816-4af7-97cf-00e31c542eef-kube-api-access-czcks" (OuterVolumeSpecName: "kube-api-access-czcks") pod "9df0684a-2816-4af7-97cf-00e31c542eef" (UID: "9df0684a-2816-4af7-97cf-00e31c542eef"). InnerVolumeSpecName "kube-api-access-czcks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.585498 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6790ccc5-8f7f-4de8-bd69-652661631307" (UID: "6790ccc5-8f7f-4de8-bd69-652661631307"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.594692 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26f9c7a2-e2b4-4be1-8206-6c067702cc74" (UID: "26f9c7a2-e2b4-4be1-8206-6c067702cc74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.601017 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "9df0684a-2816-4af7-97cf-00e31c542eef" (UID: "9df0684a-2816-4af7-97cf-00e31c542eef"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.603939 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e5efc6dd-6a36-4491-b090-b4c9301ec7d0" (UID: "e5efc6dd-6a36-4491-b090-b4c9301ec7d0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.606556 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5efc6dd-6a36-4491-b090-b4c9301ec7d0" (UID: "e5efc6dd-6a36-4491-b090-b4c9301ec7d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.613393 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9df0684a-2816-4af7-97cf-00e31c542eef" (UID: "9df0684a-2816-4af7-97cf-00e31c542eef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625707 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625751 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625760 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6790ccc5-8f7f-4de8-bd69-652661631307-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625771 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625779 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625787 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625795 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 
03:05:41.625822 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625831 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625840 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625851 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbn95\" (UniqueName: \"kubernetes.io/projected/26f9c7a2-e2b4-4be1-8206-6c067702cc74-kube-api-access-vbn95\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625860 4901 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625868 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws4pt\" (UniqueName: \"kubernetes.io/projected/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-kube-api-access-ws4pt\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625894 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czcks\" (UniqueName: \"kubernetes.io/projected/9df0684a-2816-4af7-97cf-00e31c542eef-kube-api-access-czcks\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625903 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625911 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdcnr\" (UniqueName: \"kubernetes.io/projected/6790ccc5-8f7f-4de8-bd69-652661631307-kube-api-access-wdcnr\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625918 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625945 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625974 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f9c7a2-e2b4-4be1-8206-6c067702cc74-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625983 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9df0684a-2816-4af7-97cf-00e31c542eef-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.625992 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.636570 4901 generic.go:334] "Generic (PLEG): container finished" podID="412e01f6-e4bb-4bbd-ba88-5726f3e2f87f" containerID="2a8500133ddbae16882734d17dcfeac24a437220c873e5c49b9335461b23a2a0" exitCode=0 Mar 09 03:05:41 crc 
kubenswrapper[4901]: I0309 03:05:41.636639 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f","Type":"ContainerDied","Data":"2a8500133ddbae16882734d17dcfeac24a437220c873e5c49b9335461b23a2a0"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.640281 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.652191 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.653271 4901 generic.go:334] "Generic (PLEG): container finished" podID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" containerID="923da5f384dd45a881e4088f214dd21db8342270329da027650b26fef0f69378" exitCode=0 Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.653380 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"12ec135f-33b3-4be3-bb27-5bb0ea25ddce","Type":"ContainerDied","Data":"923da5f384dd45a881e4088f214dd21db8342270329da027650b26fef0f69378"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.681894 4901 generic.go:334] "Generic (PLEG): container finished" podID="ac19cc68-f23c-4622-b265-6e94db65a43f" containerID="fffb85172792a1f2d2029246715912b5942a62d42487ca9ffed83a800ad7a8d7" exitCode=0 Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.681928 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac19cc68-f23c-4622-b265-6e94db65a43f","Type":"ContainerDied","Data":"fffb85172792a1f2d2029246715912b5942a62d42487ca9ffed83a800ad7a8d7"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.683004 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.683300 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.687846 4901 generic.go:334] "Generic (PLEG): container finished" podID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerID="5728e2e081eb4563b67c226046888a882e8bdb83914ea86f35f1527e56c5d36a" exitCode=0 Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.687930 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.688325 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c3d4e9a-122e-4894-98b2-91784a9f44e8","Type":"ContainerDied","Data":"5728e2e081eb4563b67c226046888a882e8bdb83914ea86f35f1527e56c5d36a"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.688365 4901 scope.go:117] "RemoveContainer" containerID="2cd524a1798da2be5336faeee44da3b0c7ebcf43441ba6883148aca3906e53c0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.694026 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc8485b48-f86rl" event={"ID":"e5efc6dd-6a36-4491-b090-b4c9301ec7d0","Type":"ContainerDied","Data":"dc7315e94862f6ebce57ac75530e055aa7deb921396c97632ce6afa930c45ff9"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.694088 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cc8485b48-f86rl" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.695613 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.705598 4901 generic.go:334] "Generic (PLEG): container finished" podID="9df0684a-2816-4af7-97cf-00e31c542eef" containerID="a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0" exitCode=0 Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.705677 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9df0684a-2816-4af7-97cf-00e31c542eef","Type":"ContainerDied","Data":"a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.705703 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9df0684a-2816-4af7-97cf-00e31c542eef","Type":"ContainerDied","Data":"45858a7d0c4eba77159193d5e3d5965a27c862de73a9ec0764595d4ab5a7ccbc"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.705857 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.712561 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "9df0684a-2816-4af7-97cf-00e31c542eef" (UID: "9df0684a-2816-4af7-97cf-00e31c542eef"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.712773 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.712858 4901 scope.go:117] "RemoveContainer" containerID="ef72ec9152a71c9764e40c053af5316561da42488861f73e1bfb72f8538b1bb8" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.715677 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "26f9c7a2-e2b4-4be1-8206-6c067702cc74" (UID: "26f9c7a2-e2b4-4be1-8206-6c067702cc74"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.715908 4901 generic.go:334] "Generic (PLEG): container finished" podID="6790ccc5-8f7f-4de8-bd69-652661631307" containerID="c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73" exitCode=0 Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.715984 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d497f76dc-pptvt" event={"ID":"6790ccc5-8f7f-4de8-bd69-652661631307","Type":"ContainerDied","Data":"c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.716011 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d497f76dc-pptvt" event={"ID":"6790ccc5-8f7f-4de8-bd69-652661631307","Type":"ContainerDied","Data":"bca77a73491ce63027ad0e0375b417d26e30eb5d643291def8d0accfd5d9c349"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.716081 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7d497f76dc-pptvt" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.721080 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_26f9c7a2-e2b4-4be1-8206-6c067702cc74/ovn-northd/0.log" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.721109 4901 generic.go:334] "Generic (PLEG): container finished" podID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" containerID="380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d" exitCode=139 Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.721167 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"26f9c7a2-e2b4-4be1-8206-6c067702cc74","Type":"ContainerDied","Data":"380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.721186 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"26f9c7a2-e2b4-4be1-8206-6c067702cc74","Type":"ContainerDied","Data":"032244c67de9b121df4c60605ba9ece2c844f49edd92bde9ded65add4865d57a"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.721351 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.722507 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e5efc6dd-6a36-4491-b090-b4c9301ec7d0" (UID: "e5efc6dd-6a36-4491-b090-b4c9301ec7d0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.723988 4901 generic.go:334] "Generic (PLEG): container finished" podID="baa336b3-abdd-43e2-9c54-6d8d34c71204" containerID="3ed5c94eba80813636cf9478d7655211647869ac0b7da74e90655f7e8fc79465" exitCode=0 Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.724014 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" event={"ID":"baa336b3-abdd-43e2-9c54-6d8d34c71204","Type":"ContainerDied","Data":"3ed5c94eba80813636cf9478d7655211647869ac0b7da74e90655f7e8fc79465"} Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.724056 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-875b9dd78-8t9g6" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.725378 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "26f9c7a2-e2b4-4be1-8206-6c067702cc74" (UID: "26f9c7a2-e2b4-4be1-8206-6c067702cc74"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.726301 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data" (OuterVolumeSpecName: "config-data") pod "6790ccc5-8f7f-4de8-bd69-652661631307" (UID: "6790ccc5-8f7f-4de8-bd69-652661631307"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.726801 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data\") pod \"baa336b3-abdd-43e2-9c54-6d8d34c71204\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.726896 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data-custom\") pod \"baa336b3-abdd-43e2-9c54-6d8d34c71204\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.726930 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62mrb\" (UniqueName: \"kubernetes.io/projected/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-kube-api-access-62mrb\") pod \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.726968 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-combined-ca-bundle\") pod \"baa336b3-abdd-43e2-9c54-6d8d34c71204\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.727005 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-combined-ca-bundle\") pod \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.727029 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-config-data\") pod \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\" (UID: \"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.727063 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79g9h\" (UniqueName: \"kubernetes.io/projected/baa336b3-abdd-43e2-9c54-6d8d34c71204-kube-api-access-79g9h\") pod \"baa336b3-abdd-43e2-9c54-6d8d34c71204\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.727141 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa336b3-abdd-43e2-9c54-6d8d34c71204-logs\") pod \"baa336b3-abdd-43e2-9c54-6d8d34c71204\" (UID: \"baa336b3-abdd-43e2-9c54-6d8d34c71204\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.727463 4901 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df0684a-2816-4af7-97cf-00e31c542eef-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.727476 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.727485 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6790ccc5-8f7f-4de8-bd69-652661631307-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.727493 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: 
I0309 03:05:41.727504 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/26f9c7a2-e2b4-4be1-8206-6c067702cc74-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.727512 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.727794 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa336b3-abdd-43e2-9c54-6d8d34c71204-logs" (OuterVolumeSpecName: "logs") pod "baa336b3-abdd-43e2-9c54-6d8d34c71204" (UID: "baa336b3-abdd-43e2-9c54-6d8d34c71204"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.728445 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data" (OuterVolumeSpecName: "config-data") pod "e5efc6dd-6a36-4491-b090-b4c9301ec7d0" (UID: "e5efc6dd-6a36-4491-b090-b4c9301ec7d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.742499 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "baa336b3-abdd-43e2-9c54-6d8d34c71204" (UID: "baa336b3-abdd-43e2-9c54-6d8d34c71204"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.745797 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-kube-api-access-62mrb" (OuterVolumeSpecName: "kube-api-access-62mrb") pod "412e01f6-e4bb-4bbd-ba88-5726f3e2f87f" (UID: "412e01f6-e4bb-4bbd-ba88-5726f3e2f87f"). InnerVolumeSpecName "kube-api-access-62mrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.751689 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa336b3-abdd-43e2-9c54-6d8d34c71204-kube-api-access-79g9h" (OuterVolumeSpecName: "kube-api-access-79g9h") pod "baa336b3-abdd-43e2-9c54-6d8d34c71204" (UID: "baa336b3-abdd-43e2-9c54-6d8d34c71204"). InnerVolumeSpecName "kube-api-access-79g9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.752608 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "baa336b3-abdd-43e2-9c54-6d8d34c71204" (UID: "baa336b3-abdd-43e2-9c54-6d8d34c71204"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.759271 4901 scope.go:117] "RemoveContainer" containerID="5728e2e081eb4563b67c226046888a882e8bdb83914ea86f35f1527e56c5d36a" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.774637 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data" (OuterVolumeSpecName: "config-data") pod "baa336b3-abdd-43e2-9c54-6d8d34c71204" (UID: "baa336b3-abdd-43e2-9c54-6d8d34c71204"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.777464 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-config-data" (OuterVolumeSpecName: "config-data") pod "412e01f6-e4bb-4bbd-ba88-5726f3e2f87f" (UID: "412e01f6-e4bb-4bbd-ba88-5726f3e2f87f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.783353 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "412e01f6-e4bb-4bbd-ba88-5726f3e2f87f" (UID: "412e01f6-e4bb-4bbd-ba88-5726f3e2f87f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.804012 4901 scope.go:117] "RemoveContainer" containerID="fd30c7afd1e4a7b025b8b590cdbf65ff2deb03badcd9cc3419bf24e1b542c1ed" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.825782 4901 scope.go:117] "RemoveContainer" containerID="cf4f1b9588039d52f25928fa5f258488b5893e580645e30a5d8a63dcf3396c0b" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828054 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-config-data\") pod \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828090 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-scripts\") pod \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " Mar 09 03:05:41 crc 
kubenswrapper[4901]: I0309 03:05:41.828109 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828134 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-combined-ca-bundle\") pod \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828154 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-combined-ca-bundle\") pod \"ac19cc68-f23c-4622-b265-6e94db65a43f\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828177 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-combined-ca-bundle\") pod \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828214 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-config-data\") pod \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828264 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-httpd-run\") pod 
\"ac19cc68-f23c-4622-b265-6e94db65a43f\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828288 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-config-data\") pod \"ac19cc68-f23c-4622-b265-6e94db65a43f\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828314 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p5pq\" (UniqueName: \"kubernetes.io/projected/2c3d4e9a-122e-4894-98b2-91784a9f44e8-kube-api-access-4p5pq\") pod \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828346 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-scripts\") pod \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828372 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-scripts\") pod \"ac19cc68-f23c-4622-b265-6e94db65a43f\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828514 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-public-tls-certs\") pod \"ac19cc68-f23c-4622-b265-6e94db65a43f\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828557 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-logs\") pod \"ac19cc68-f23c-4622-b265-6e94db65a43f\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828578 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-internal-tls-certs\") pod \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828601 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-httpd-run\") pod \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828626 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-ceilometer-tls-certs\") pod \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828655 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-sg-core-conf-yaml\") pod \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828684 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-logs\") pod \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828712 
4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-run-httpd\") pod \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828745 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6rw8\" (UniqueName: \"kubernetes.io/projected/ac19cc68-f23c-4622-b265-6e94db65a43f-kube-api-access-c6rw8\") pod \"ac19cc68-f23c-4622-b265-6e94db65a43f\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828761 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwnv6\" (UniqueName: \"kubernetes.io/projected/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-kube-api-access-cwnv6\") pod \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\" (UID: \"12ec135f-33b3-4be3-bb27-5bb0ea25ddce\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828777 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ac19cc68-f23c-4622-b265-6e94db65a43f\" (UID: \"ac19cc68-f23c-4622-b265-6e94db65a43f\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.828792 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-log-httpd\") pod \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\" (UID: \"2c3d4e9a-122e-4894-98b2-91784a9f44e8\") " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829129 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5efc6dd-6a36-4491-b090-b4c9301ec7d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829147 
4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa336b3-abdd-43e2-9c54-6d8d34c71204-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829156 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829165 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829178 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62mrb\" (UniqueName: \"kubernetes.io/projected/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-kube-api-access-62mrb\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829187 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa336b3-abdd-43e2-9c54-6d8d34c71204-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829196 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829206 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829214 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79g9h\" (UniqueName: 
\"kubernetes.io/projected/baa336b3-abdd-43e2-9c54-6d8d34c71204-kube-api-access-79g9h\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829540 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2c3d4e9a-122e-4894-98b2-91784a9f44e8" (UID: "2c3d4e9a-122e-4894-98b2-91784a9f44e8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.829915 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ac19cc68-f23c-4622-b265-6e94db65a43f" (UID: "ac19cc68-f23c-4622-b265-6e94db65a43f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.830313 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "12ec135f-33b3-4be3-bb27-5bb0ea25ddce" (UID: "12ec135f-33b3-4be3-bb27-5bb0ea25ddce"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.830423 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2c3d4e9a-122e-4894-98b2-91784a9f44e8" (UID: "2c3d4e9a-122e-4894-98b2-91784a9f44e8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.830602 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-scripts" (OuterVolumeSpecName: "scripts") pod "12ec135f-33b3-4be3-bb27-5bb0ea25ddce" (UID: "12ec135f-33b3-4be3-bb27-5bb0ea25ddce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.832685 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-scripts" (OuterVolumeSpecName: "scripts") pod "ac19cc68-f23c-4622-b265-6e94db65a43f" (UID: "ac19cc68-f23c-4622-b265-6e94db65a43f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.833145 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3d4e9a-122e-4894-98b2-91784a9f44e8-kube-api-access-4p5pq" (OuterVolumeSpecName: "kube-api-access-4p5pq") pod "2c3d4e9a-122e-4894-98b2-91784a9f44e8" (UID: "2c3d4e9a-122e-4894-98b2-91784a9f44e8"). InnerVolumeSpecName "kube-api-access-4p5pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.834001 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac19cc68-f23c-4622-b265-6e94db65a43f-kube-api-access-c6rw8" (OuterVolumeSpecName: "kube-api-access-c6rw8") pod "ac19cc68-f23c-4622-b265-6e94db65a43f" (UID: "ac19cc68-f23c-4622-b265-6e94db65a43f"). InnerVolumeSpecName "kube-api-access-c6rw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.846581 4901 scope.go:117] "RemoveContainer" containerID="148d0935d0af545a31bff0013cab741797e444db8fdba8b7ef3fe82da71d3a67" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.846882 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-logs" (OuterVolumeSpecName: "logs") pod "12ec135f-33b3-4be3-bb27-5bb0ea25ddce" (UID: "12ec135f-33b3-4be3-bb27-5bb0ea25ddce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.849075 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "ac19cc68-f23c-4622-b265-6e94db65a43f" (UID: "ac19cc68-f23c-4622-b265-6e94db65a43f"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.850131 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-logs" (OuterVolumeSpecName: "logs") pod "ac19cc68-f23c-4622-b265-6e94db65a43f" (UID: "ac19cc68-f23c-4622-b265-6e94db65a43f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.851426 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-kube-api-access-cwnv6" (OuterVolumeSpecName: "kube-api-access-cwnv6") pod "12ec135f-33b3-4be3-bb27-5bb0ea25ddce" (UID: "12ec135f-33b3-4be3-bb27-5bb0ea25ddce"). InnerVolumeSpecName "kube-api-access-cwnv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.852546 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-scripts" (OuterVolumeSpecName: "scripts") pod "2c3d4e9a-122e-4894-98b2-91784a9f44e8" (UID: "2c3d4e9a-122e-4894-98b2-91784a9f44e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.862274 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2c3d4e9a-122e-4894-98b2-91784a9f44e8" (UID: "2c3d4e9a-122e-4894-98b2-91784a9f44e8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.863325 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "12ec135f-33b3-4be3-bb27-5bb0ea25ddce" (UID: "12ec135f-33b3-4be3-bb27-5bb0ea25ddce"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.880872 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac19cc68-f23c-4622-b265-6e94db65a43f" (UID: "ac19cc68-f23c-4622-b265-6e94db65a43f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.880902 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ac19cc68-f23c-4622-b265-6e94db65a43f" (UID: "ac19cc68-f23c-4622-b265-6e94db65a43f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.884209 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-config-data" (OuterVolumeSpecName: "config-data") pod "12ec135f-33b3-4be3-bb27-5bb0ea25ddce" (UID: "12ec135f-33b3-4be3-bb27-5bb0ea25ddce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.887051 4901 scope.go:117] "RemoveContainer" containerID="a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.887279 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-config-data" (OuterVolumeSpecName: "config-data") pod "ac19cc68-f23c-4622-b265-6e94db65a43f" (UID: "ac19cc68-f23c-4622-b265-6e94db65a43f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.891404 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2c3d4e9a-122e-4894-98b2-91784a9f44e8" (UID: "2c3d4e9a-122e-4894-98b2-91784a9f44e8"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.895959 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12ec135f-33b3-4be3-bb27-5bb0ea25ddce" (UID: "12ec135f-33b3-4be3-bb27-5bb0ea25ddce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.898690 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c3d4e9a-122e-4894-98b2-91784a9f44e8" (UID: "2c3d4e9a-122e-4894-98b2-91784a9f44e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.902534 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "12ec135f-33b3-4be3-bb27-5bb0ea25ddce" (UID: "12ec135f-33b3-4be3-bb27-5bb0ea25ddce"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.906709 4901 scope.go:117] "RemoveContainer" containerID="0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.926136 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-config-data" (OuterVolumeSpecName: "config-data") pod "2c3d4e9a-122e-4894-98b2-91784a9f44e8" (UID: "2c3d4e9a-122e-4894-98b2-91784a9f44e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.928710 4901 scope.go:117] "RemoveContainer" containerID="a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0" Mar 09 03:05:41 crc kubenswrapper[4901]: E0309 03:05:41.929189 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0\": container with ID starting with a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0 not found: ID does not exist" containerID="a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.929246 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0"} err="failed to get container status \"a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0\": rpc error: code = NotFound desc = could not find container \"a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0\": container with ID starting with a30e9af390a47f07275ae732e18499bb625ab606c1ab75fcb6f396d64e6313b0 not found: ID does not exist" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.929267 4901 scope.go:117] "RemoveContainer" containerID="0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa" Mar 09 03:05:41 crc kubenswrapper[4901]: E0309 03:05:41.929619 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa\": container with ID starting with 0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa not found: ID does not exist" containerID="0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.929671 
4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa"} err="failed to get container status \"0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa\": rpc error: code = NotFound desc = could not find container \"0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa\": container with ID starting with 0c2c9885ccda1bae73c95e9243ec29fe06b6d01586a48419ab131559dc2b48fa not found: ID does not exist" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.929705 4901 scope.go:117] "RemoveContainer" containerID="c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930395 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwnv6\" (UniqueName: \"kubernetes.io/projected/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-kube-api-access-cwnv6\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930414 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6rw8\" (UniqueName: \"kubernetes.io/projected/ac19cc68-f23c-4622-b265-6e94db65a43f-kube-api-access-c6rw8\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930459 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930469 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930478 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930491 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930553 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930570 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930580 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930589 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930597 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930605 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 
03:05:41.930641 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930650 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p5pq\" (UniqueName: \"kubernetes.io/projected/2c3d4e9a-122e-4894-98b2-91784a9f44e8-kube-api-access-4p5pq\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930659 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930667 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930674 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac19cc68-f23c-4622-b265-6e94db65a43f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930682 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac19cc68-f23c-4622-b265-6e94db65a43f-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930707 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930716 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930726 4901 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930734 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c3d4e9a-122e-4894-98b2-91784a9f44e8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930742 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ec135f-33b3-4be3-bb27-5bb0ea25ddce-logs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.930751 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c3d4e9a-122e-4894-98b2-91784a9f44e8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.951061 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.951516 4901 scope.go:117] "RemoveContainer" containerID="4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772" Mar 09 03:05:41 crc kubenswrapper[4901]: I0309 03:05:41.962825 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.032581 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.032602 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.123607 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d382c9a-714e-41fc-8266-8e4ee322f5c7" path="/var/lib/kubelet/pods/1d382c9a-714e-41fc-8266-8e4ee322f5c7/volumes" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.124347 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3e03cd-75ae-46dc-aee4-b778929cf535" path="/var/lib/kubelet/pods/2b3e03cd-75ae-46dc-aee4-b778929cf535/volumes" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.126891 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719d451b-159a-4fa7-9c72-54f42fb4f216" path="/var/lib/kubelet/pods/719d451b-159a-4fa7-9c72-54f42fb4f216/volumes" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.127906 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" path="/var/lib/kubelet/pods/7cd100e4-dfd3-45a7-a97c-84a05c352883/volumes" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.129168 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" path="/var/lib/kubelet/pods/966d96ae-fba9-4ecd-85e5-a81cecfb2ed3/volumes" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.130784 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" path="/var/lib/kubelet/pods/ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d/volumes" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.134931 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d57347fc-0546-466f-95e6-055857ca3685" path="/var/lib/kubelet/pods/d57347fc-0546-466f-95e6-055857ca3685/volumes" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.153858 4901 scope.go:117] "RemoveContainer" containerID="c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73" Mar 09 03:05:42 crc kubenswrapper[4901]: E0309 03:05:42.154864 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73\": container with ID starting with c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73 not found: ID does not exist" containerID="c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.154910 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73"} err="failed to get container status \"c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73\": rpc error: code = NotFound desc = could not find container \"c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73\": container with ID starting with c3f30f9dc1ce91e181e2e5cb21149d56e84cb129c67ad539aaf2087c64070e73 not found: ID does not exist" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.154935 4901 scope.go:117] "RemoveContainer" containerID="4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772" Mar 09 03:05:42 crc kubenswrapper[4901]: E0309 03:05:42.155303 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772\": container with ID starting with 4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772 not found: ID does not exist" 
containerID="4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.155333 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772"} err="failed to get container status \"4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772\": rpc error: code = NotFound desc = could not find container \"4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772\": container with ID starting with 4b17bc681ef09af1769a69b4b52d036b3ca7d2d0d4b4182be89cf1ab304d4772 not found: ID does not exist" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.155354 4901 scope.go:117] "RemoveContainer" containerID="40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.281852 4901 scope.go:117] "RemoveContainer" containerID="380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.292206 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.292917 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.315694 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.317436 4901 scope.go:117] "RemoveContainer" containerID="40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593" Mar 09 03:05:42 crc kubenswrapper[4901]: E0309 03:05:42.317905 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593\": container with ID starting with 40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593 not found: ID does not exist" containerID="40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.317936 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593"} err="failed to get container status \"40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593\": rpc error: code = NotFound desc = could not find container \"40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593\": container with ID starting with 40a1a57654e5d5e4c9a9cd5ea7b2003a15b2859aff2f929b6d3f5d5857124593 not found: ID does not exist" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.317961 4901 scope.go:117] "RemoveContainer" containerID="380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d" Mar 09 03:05:42 crc kubenswrapper[4901]: E0309 03:05:42.318252 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d\": container with ID starting with 380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d not found: ID does not exist" containerID="380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.318273 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d"} err="failed to get container status \"380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d\": rpc error: code = NotFound desc = could not find container \"380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d\": container with ID starting with 380ec713a35cadba56e726245f8b17f2443d117a6aac88cbd4d4d5386efa672d not found: ID does not exist" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.318287 4901 scope.go:117] "RemoveContainer" containerID="3ed5c94eba80813636cf9478d7655211647869ac0b7da74e90655f7e8fc79465" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.328435 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7d497f76dc-pptvt"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.344714 4901 scope.go:117] "RemoveContainer" containerID="6e6b9a76927163c431cd48ce16cec53d765e06541e962dcf3dccf5f2b20e4b6e" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.346236 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7d497f76dc-pptvt"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.356954 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.365505 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.371329 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovn-northd-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.380386 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.386763 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cc8485b48-f86rl"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.393069 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6cc8485b48-f86rl"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.407378 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-875b9dd78-8t9g6"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.414130 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-875b9dd78-8t9g6"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.441204 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-fernet-keys\") pod \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.441328 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-config-data\") pod \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.441362 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-credential-keys\") pod \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.441382 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-public-tls-certs\") pod \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.441535 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-scripts\") pod \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.441553 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbnsj\" (UniqueName: \"kubernetes.io/projected/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-kube-api-access-dbnsj\") pod \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.441570 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-internal-tls-certs\") pod \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.441589 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-combined-ca-bundle\") pod \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\" (UID: \"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d\") " Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.448771 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-kube-api-access-dbnsj" (OuterVolumeSpecName: "kube-api-access-dbnsj") pod 
"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" (UID: "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d"). InnerVolumeSpecName "kube-api-access-dbnsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.449062 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" (UID: "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.457433 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" (UID: "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.464853 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-scripts" (OuterVolumeSpecName: "scripts") pod "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" (UID: "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.470926 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" (UID: "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.487453 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" (UID: "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.495449 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-config-data" (OuterVolumeSpecName: "config-data") pod "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" (UID: "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.506446 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" (UID: "e45b6a38-6035-4fd4-a525-5d51ac6d0a2d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.543609 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.543661 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbnsj\" (UniqueName: \"kubernetes.io/projected/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-kube-api-access-dbnsj\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.543677 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.543689 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.543700 4901 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.543711 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.543722 4901 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.543733 4901 
reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.754147 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"412e01f6-e4bb-4bbd-ba88-5726f3e2f87f","Type":"ContainerDied","Data":"6b9e33445c19373191e94b111d4c2b86f6d395ccfd04820e7d1d5abe3743aa0e"} Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.754219 4901 scope.go:117] "RemoveContainer" containerID="2a8500133ddbae16882734d17dcfeac24a437220c873e5c49b9335461b23a2a0" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.754392 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.766036 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"12ec135f-33b3-4be3-bb27-5bb0ea25ddce","Type":"ContainerDied","Data":"f93312899e026f6aa80e17f08b7fea820f795932388bbc17762e07c45cc308c2"} Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.770710 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.779046 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac19cc68-f23c-4622-b265-6e94db65a43f","Type":"ContainerDied","Data":"9152285f054a950d74baffc0f701b5abed3213de1f526aa39fa265f7214f9ed1"} Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.779095 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.779282 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.787596 4901 generic.go:334] "Generic (PLEG): container finished" podID="e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" containerID="2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355" exitCode=0 Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.787664 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79578f965f-zpp5p" event={"ID":"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d","Type":"ContainerDied","Data":"2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355"} Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.787703 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79578f965f-zpp5p" event={"ID":"e45b6a38-6035-4fd4-a525-5d51ac6d0a2d","Type":"ContainerDied","Data":"e9eb5e1585081f94ba18b305e25fe998c1db60ca1d2a6be40107c6d77dfa8484"} Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.787771 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-79578f965f-zpp5p" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.797085 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.809790 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.811326 4901 scope.go:117] "RemoveContainer" containerID="923da5f384dd45a881e4088f214dd21db8342270329da027650b26fef0f69378" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.819294 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.835521 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.839692 4901 scope.go:117] "RemoveContainer" containerID="b6d35aeb5dab9771d9b67accacc82110a3fcd1a1a64f3b6be5cc15e368bd1336" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.845005 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.852267 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-79578f965f-zpp5p"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.858671 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-79578f965f-zpp5p"] Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.859454 4901 scope.go:117] "RemoveContainer" containerID="fffb85172792a1f2d2029246715912b5942a62d42487ca9ffed83a800ad7a8d7" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.880466 4901 scope.go:117] "RemoveContainer" containerID="d69c564e49621dd1b26ec80f330b2a7ebc14dc4c83034905dc611e74754ca966" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.897646 4901 
scope.go:117] "RemoveContainer" containerID="2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.923966 4901 scope.go:117] "RemoveContainer" containerID="2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355" Mar 09 03:05:42 crc kubenswrapper[4901]: E0309 03:05:42.924747 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355\": container with ID starting with 2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355 not found: ID does not exist" containerID="2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355" Mar 09 03:05:42 crc kubenswrapper[4901]: I0309 03:05:42.924817 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355"} err="failed to get container status \"2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355\": rpc error: code = NotFound desc = could not find container \"2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355\": container with ID starting with 2e31d5cb8dd1ce0c928d719260414673c9ea3395f2fcb765ea37517d7821c355 not found: ID does not exist" Mar 09 03:05:43 crc kubenswrapper[4901]: E0309 03:05:43.053876 4901 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 09 03:05:43 crc kubenswrapper[4901]: E0309 03:05:43.053973 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data podName:98538e55-cb87-49e2-9fd5-fff06d7edfdd nodeName:}" failed. No retries permitted until 2026-03-09 03:05:51.053949753 +0000 UTC m=+1475.643613505 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data") pod "rabbitmq-cell1-server-0" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd") : configmap "rabbitmq-cell1-config-data" not found Mar 09 03:05:43 crc kubenswrapper[4901]: E0309 03:05:43.085337 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 09 03:05:43 crc kubenswrapper[4901]: E0309 03:05:43.086665 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 09 03:05:43 crc kubenswrapper[4901]: E0309 03:05:43.088399 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 09 03:05:43 crc kubenswrapper[4901]: E0309 03:05:43.088438 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="19b624e5-b3de-4724-b995-829d3fcd48ae" containerName="nova-cell1-conductor-conductor" Mar 09 03:05:43 crc kubenswrapper[4901]: I0309 03:05:43.805556 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="98538e55-cb87-49e2-9fd5-fff06d7edfdd" containerID="d88fb8444efa6a21fe15aca1c8ba0da30c0a28364fd9a1356f05611a979ae19f" exitCode=0 Mar 09 03:05:43 crc kubenswrapper[4901]: I0309 03:05:43.805614 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98538e55-cb87-49e2-9fd5-fff06d7edfdd","Type":"ContainerDied","Data":"d88fb8444efa6a21fe15aca1c8ba0da30c0a28364fd9a1356f05611a979ae19f"} Mar 09 03:05:43 crc kubenswrapper[4901]: E0309 03:05:43.836890 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 03:05:43 crc kubenswrapper[4901]: E0309 03:05:43.839040 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 03:05:43 crc kubenswrapper[4901]: E0309 03:05:43.840639 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 03:05:43 crc kubenswrapper[4901]: E0309 03:05:43.840707 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6b3d0806-00f0-46d7-a77f-f505583e49a2" containerName="nova-scheduler-scheduler" 
Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.038186 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.038677 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.039309 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.039372 4901 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server" Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.040251 4901 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.041891 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.043370 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.043430 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovs-vswitchd" Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.072117 4901 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.072197 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data podName:46c7df0b-fc0a-4fd9-b097-72da03442510 nodeName:}" failed. 
No retries permitted until 2026-03-09 03:05:52.072175295 +0000 UTC m=+1476.661839037 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data") pod "rabbitmq-server-0" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510") : configmap "rabbitmq-config-data" not found Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.076494 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.124009 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" path="/var/lib/kubelet/pods/12ec135f-33b3-4be3-bb27-5bb0ea25ddce/volumes" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.125192 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" path="/var/lib/kubelet/pods/26f9c7a2-e2b4-4be1-8206-6c067702cc74/volumes" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.125836 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" path="/var/lib/kubelet/pods/2c3d4e9a-122e-4894-98b2-91784a9f44e8/volumes" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.127300 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412e01f6-e4bb-4bbd-ba88-5726f3e2f87f" path="/var/lib/kubelet/pods/412e01f6-e4bb-4bbd-ba88-5726f3e2f87f/volumes" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.127885 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6790ccc5-8f7f-4de8-bd69-652661631307" path="/var/lib/kubelet/pods/6790ccc5-8f7f-4de8-bd69-652661631307/volumes" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.129140 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9df0684a-2816-4af7-97cf-00e31c542eef" 
path="/var/lib/kubelet/pods/9df0684a-2816-4af7-97cf-00e31c542eef/volumes" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.129804 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac19cc68-f23c-4622-b265-6e94db65a43f" path="/var/lib/kubelet/pods/ac19cc68-f23c-4622-b265-6e94db65a43f/volumes" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.130473 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa336b3-abdd-43e2-9c54-6d8d34c71204" path="/var/lib/kubelet/pods/baa336b3-abdd-43e2-9c54-6d8d34c71204/volumes" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.131964 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" path="/var/lib/kubelet/pods/e45b6a38-6035-4fd4-a525-5d51ac6d0a2d/volumes" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.132704 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" path="/var/lib/kubelet/pods/e5efc6dd-6a36-4491-b090-b4c9301ec7d0/volumes" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.173560 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-server-conf\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.173603 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.173650 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-erlang-cookie\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.173670 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-confd\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.173692 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.173729 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrztr\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-kube-api-access-wrztr\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.173745 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98538e55-cb87-49e2-9fd5-fff06d7edfdd-erlang-cookie-secret\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.173776 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-tls\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc 
kubenswrapper[4901]: I0309 03:05:44.173809 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-plugins\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.173834 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98538e55-cb87-49e2-9fd5-fff06d7edfdd-pod-info\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.173901 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-plugins-conf\") pod \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\" (UID: \"98538e55-cb87-49e2-9fd5-fff06d7edfdd\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.174900 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.175246 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.175411 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.179730 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/98538e55-cb87-49e2-9fd5-fff06d7edfdd-pod-info" (OuterVolumeSpecName: "pod-info") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.179902 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.182927 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.205050 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98538e55-cb87-49e2-9fd5-fff06d7edfdd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.205446 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-kube-api-access-wrztr" (OuterVolumeSpecName: "kube-api-access-wrztr") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "kube-api-access-wrztr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.205681 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data" (OuterVolumeSpecName: "config-data") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.223033 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-server-conf" (OuterVolumeSpecName: "server-conf") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.226063 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.250448 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "98538e55-cb87-49e2-9fd5-fff06d7edfdd" (UID: "98538e55-cb87-49e2-9fd5-fff06d7edfdd"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.275935 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.275961 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.275970 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.275980 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrztr\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-kube-api-access-wrztr\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.275988 4901 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98538e55-cb87-49e2-9fd5-fff06d7edfdd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.275996 4901 reconciler_common.go:293] "Volume detached for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.276004 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98538e55-cb87-49e2-9fd5-fff06d7edfdd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.276011 4901 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98538e55-cb87-49e2-9fd5-fff06d7edfdd-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.276019 4901 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.276027 4901 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98538e55-cb87-49e2-9fd5-fff06d7edfdd-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.276054 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.290238 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377377 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-plugins\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: 
\"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377445 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-plugins-conf\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377478 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46c7df0b-fc0a-4fd9-b097-72da03442510-erlang-cookie-secret\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377531 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-confd\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377565 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377614 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-erlang-cookie\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377655 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-95n4v\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-kube-api-access-95n4v\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377683 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377720 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-server-conf\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377763 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46c7df0b-fc0a-4fd9-b097-72da03442510-pod-info\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377797 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-tls\") pod \"46c7df0b-fc0a-4fd9-b097-72da03442510\" (UID: \"46c7df0b-fc0a-4fd9-b097-72da03442510\") " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.377965 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.378294 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.378321 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.378788 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.378842 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.381839 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-kube-api-access-95n4v" (OuterVolumeSpecName: "kube-api-access-95n4v") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "kube-api-access-95n4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.381899 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.382250 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c7df0b-fc0a-4fd9-b097-72da03442510-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.382840 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.383001 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/46c7df0b-fc0a-4fd9-b097-72da03442510-pod-info" (OuterVolumeSpecName: "pod-info") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.399615 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data" (OuterVolumeSpecName: "config-data") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.419425 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-server-conf" (OuterVolumeSpecName: "server-conf") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.462799 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "46c7df0b-fc0a-4fd9-b097-72da03442510" (UID: "46c7df0b-fc0a-4fd9-b097-72da03442510"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.480124 4901 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46c7df0b-fc0a-4fd9-b097-72da03442510-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.480175 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.480197 4901 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.480217 4901 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46c7df0b-fc0a-4fd9-b097-72da03442510-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.480241 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.480283 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.480305 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46c7df0b-fc0a-4fd9-b097-72da03442510-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.480324 
4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95n4v\" (UniqueName: \"kubernetes.io/projected/46c7df0b-fc0a-4fd9-b097-72da03442510-kube-api-access-95n4v\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.480396 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.480417 4901 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46c7df0b-fc0a-4fd9-b097-72da03442510-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.508363 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.587977 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.832066 4901 generic.go:334] "Generic (PLEG): container finished" podID="46c7df0b-fc0a-4fd9-b097-72da03442510" containerID="8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0" exitCode=0 Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.832273 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.833190 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46c7df0b-fc0a-4fd9-b097-72da03442510","Type":"ContainerDied","Data":"8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0"} Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.833305 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"46c7df0b-fc0a-4fd9-b097-72da03442510","Type":"ContainerDied","Data":"d09b0ebec0684a62c7563c9ab80e719b64e7aaa08a6829e9b18c6e07a9220e17"} Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.833352 4901 scope.go:117] "RemoveContainer" containerID="8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.840941 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98538e55-cb87-49e2-9fd5-fff06d7edfdd","Type":"ContainerDied","Data":"d7fc083e19dda162052a9d32b6cce2981ad403f0496af4440f9b9fc162c5bfd3"} Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.841079 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.871418 4901 scope.go:117] "RemoveContainer" containerID="16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.954053 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.960000 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.971196 4901 scope.go:117] "RemoveContainer" containerID="8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.974840 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.975732 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0\": container with ID starting with 8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0 not found: ID does not exist" containerID="8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.975828 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0"} err="failed to get container status \"8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0\": rpc error: code = NotFound desc = could not find container \"8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0\": container with ID starting with 8e53b64e302219a5893985d8539fa02c98bbe9e4e4c23ce74114a1519a51b3c0 not found: ID does not exist" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 
03:05:44.975903 4901 scope.go:117] "RemoveContainer" containerID="16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7" Mar 09 03:05:44 crc kubenswrapper[4901]: E0309 03:05:44.976215 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7\": container with ID starting with 16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7 not found: ID does not exist" containerID="16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.976323 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7"} err="failed to get container status \"16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7\": rpc error: code = NotFound desc = could not find container \"16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7\": container with ID starting with 16bb4afb1c0b882241395638795c8e5f3d5e49f87188f36ca44e3bfb83ad26f7 not found: ID does not exist" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.976389 4901 scope.go:117] "RemoveContainer" containerID="d88fb8444efa6a21fe15aca1c8ba0da30c0a28364fd9a1356f05611a979ae19f" Mar 09 03:05:44 crc kubenswrapper[4901]: I0309 03:05:44.982715 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.029693 4901 scope.go:117] "RemoveContainer" containerID="d573b837ddf089152e6738d97df2ec1aa5c6f25f6f2ae8c229ee9079ec71fbad" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.213691 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.303777 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-config-data\") pod \"19b624e5-b3de-4724-b995-829d3fcd48ae\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.303859 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-combined-ca-bundle\") pod \"19b624e5-b3de-4724-b995-829d3fcd48ae\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.303914 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w69nf\" (UniqueName: \"kubernetes.io/projected/19b624e5-b3de-4724-b995-829d3fcd48ae-kube-api-access-w69nf\") pod \"19b624e5-b3de-4724-b995-829d3fcd48ae\" (UID: \"19b624e5-b3de-4724-b995-829d3fcd48ae\") " Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.308965 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b624e5-b3de-4724-b995-829d3fcd48ae-kube-api-access-w69nf" (OuterVolumeSpecName: "kube-api-access-w69nf") pod "19b624e5-b3de-4724-b995-829d3fcd48ae" (UID: "19b624e5-b3de-4724-b995-829d3fcd48ae"). InnerVolumeSpecName "kube-api-access-w69nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.336535 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19b624e5-b3de-4724-b995-829d3fcd48ae" (UID: "19b624e5-b3de-4724-b995-829d3fcd48ae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.346398 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-config-data" (OuterVolumeSpecName: "config-data") pod "19b624e5-b3de-4724-b995-829d3fcd48ae" (UID: "19b624e5-b3de-4724-b995-829d3fcd48ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.360077 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5b7f6df545-whtgc" podUID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.170:9696/\": dial tcp 10.217.0.170:9696: connect: connection refused" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.373735 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.406001 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.406026 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19b624e5-b3de-4724-b995-829d3fcd48ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.406036 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w69nf\" (UniqueName: \"kubernetes.io/projected/19b624e5-b3de-4724-b995-829d3fcd48ae-kube-api-access-w69nf\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.507122 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-combined-ca-bundle\") pod \"6b3d0806-00f0-46d7-a77f-f505583e49a2\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.507176 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjt8j\" (UniqueName: \"kubernetes.io/projected/6b3d0806-00f0-46d7-a77f-f505583e49a2-kube-api-access-mjt8j\") pod \"6b3d0806-00f0-46d7-a77f-f505583e49a2\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.507217 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-config-data\") pod \"6b3d0806-00f0-46d7-a77f-f505583e49a2\" (UID: \"6b3d0806-00f0-46d7-a77f-f505583e49a2\") " Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.511410 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3d0806-00f0-46d7-a77f-f505583e49a2-kube-api-access-mjt8j" (OuterVolumeSpecName: "kube-api-access-mjt8j") pod "6b3d0806-00f0-46d7-a77f-f505583e49a2" (UID: "6b3d0806-00f0-46d7-a77f-f505583e49a2"). InnerVolumeSpecName "kube-api-access-mjt8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.526022 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-config-data" (OuterVolumeSpecName: "config-data") pod "6b3d0806-00f0-46d7-a77f-f505583e49a2" (UID: "6b3d0806-00f0-46d7-a77f-f505583e49a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.543077 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b3d0806-00f0-46d7-a77f-f505583e49a2" (UID: "6b3d0806-00f0-46d7-a77f-f505583e49a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.609105 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.609146 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjt8j\" (UniqueName: \"kubernetes.io/projected/6b3d0806-00f0-46d7-a77f-f505583e49a2-kube-api-access-mjt8j\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.609160 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3d0806-00f0-46d7-a77f-f505583e49a2-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.858618 4901 generic.go:334] "Generic (PLEG): container finished" podID="6b3d0806-00f0-46d7-a77f-f505583e49a2" containerID="5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" exitCode=0 Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.858662 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6b3d0806-00f0-46d7-a77f-f505583e49a2","Type":"ContainerDied","Data":"5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27"} Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.859146 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"6b3d0806-00f0-46d7-a77f-f505583e49a2","Type":"ContainerDied","Data":"905d7cd97d4c58eb8b586f62c5b5e78d65d3ffb136122fb1c755faa875c96459"} Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.858675 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.859287 4901 scope.go:117] "RemoveContainer" containerID="5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.861377 4901 generic.go:334] "Generic (PLEG): container finished" podID="19b624e5-b3de-4724-b995-829d3fcd48ae" containerID="efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" exitCode=0 Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.861404 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"19b624e5-b3de-4724-b995-829d3fcd48ae","Type":"ContainerDied","Data":"efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591"} Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.861423 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"19b624e5-b3de-4724-b995-829d3fcd48ae","Type":"ContainerDied","Data":"e97742193c6f9c9ef8d7ea89c5569345682293bcd2199bd28944b14f3399b6a6"} Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.861447 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.894374 4901 scope.go:117] "RemoveContainer" containerID="5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" Mar 09 03:05:45 crc kubenswrapper[4901]: E0309 03:05:45.895281 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27\": container with ID starting with 5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27 not found: ID does not exist" containerID="5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.898754 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27"} err="failed to get container status \"5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27\": rpc error: code = NotFound desc = could not find container \"5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27\": container with ID starting with 5254cd71ad47a3d4e3364e0e37c621c03d973081d495139c8fd33e59e315cc27 not found: ID does not exist" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.899151 4901 scope.go:117] "RemoveContainer" containerID="efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.911915 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.920637 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.942600 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 
03:05:45.942646 4901 scope.go:117] "RemoveContainer" containerID="efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" Mar 09 03:05:45 crc kubenswrapper[4901]: E0309 03:05:45.943296 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591\": container with ID starting with efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591 not found: ID does not exist" containerID="efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.943335 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591"} err="failed to get container status \"efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591\": rpc error: code = NotFound desc = could not find container \"efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591\": container with ID starting with efaef8fb4ac1ae8528bd34b42efc1ece916dbf688592bd94cb85d4b9a27c1591 not found: ID does not exist" Mar 09 03:05:45 crc kubenswrapper[4901]: I0309 03:05:45.949785 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 03:05:46 crc kubenswrapper[4901]: I0309 03:05:46.132153 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b624e5-b3de-4724-b995-829d3fcd48ae" path="/var/lib/kubelet/pods/19b624e5-b3de-4724-b995-829d3fcd48ae/volumes" Mar 09 03:05:46 crc kubenswrapper[4901]: I0309 03:05:46.133786 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c7df0b-fc0a-4fd9-b097-72da03442510" path="/var/lib/kubelet/pods/46c7df0b-fc0a-4fd9-b097-72da03442510/volumes" Mar 09 03:05:46 crc kubenswrapper[4901]: I0309 03:05:46.135052 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6b3d0806-00f0-46d7-a77f-f505583e49a2" path="/var/lib/kubelet/pods/6b3d0806-00f0-46d7-a77f-f505583e49a2/volumes" Mar 09 03:05:46 crc kubenswrapper[4901]: I0309 03:05:46.137831 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98538e55-cb87-49e2-9fd5-fff06d7edfdd" path="/var/lib/kubelet/pods/98538e55-cb87-49e2-9fd5-fff06d7edfdd/volumes" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.035297 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.036500 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.037321 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.037419 4901 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.038352 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.040013 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.043302 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.043384 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovs-vswitchd" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.107881 4901 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-gv8tw"] Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108332 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3e03cd-75ae-46dc-aee4-b778929cf535" containerName="memcached" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108388 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3e03cd-75ae-46dc-aee4-b778929cf535" containerName="memcached" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108417 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df0684a-2816-4af7-97cf-00e31c542eef" containerName="mysql-bootstrap" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108434 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df0684a-2816-4af7-97cf-00e31c542eef" containerName="mysql-bootstrap" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108451 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98538e55-cb87-49e2-9fd5-fff06d7edfdd" containerName="setup-container" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108464 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="98538e55-cb87-49e2-9fd5-fff06d7edfdd" containerName="setup-container" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108490 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerName="nova-api-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108502 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerName="nova-api-log" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108522 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" containerName="placement-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108535 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" 
containerName="placement-api" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108553 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac19cc68-f23c-4622-b265-6e94db65a43f" containerName="glance-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108565 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac19cc68-f23c-4622-b265-6e94db65a43f" containerName="glance-log" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108578 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c7df0b-fc0a-4fd9-b097-72da03442510" containerName="rabbitmq" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108590 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c7df0b-fc0a-4fd9-b097-72da03442510" containerName="rabbitmq" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108609 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerName="barbican-api-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108620 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerName="barbican-api-log" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108644 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98538e55-cb87-49e2-9fd5-fff06d7edfdd" containerName="rabbitmq" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108656 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="98538e55-cb87-49e2-9fd5-fff06d7edfdd" containerName="rabbitmq" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108673 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b624e5-b3de-4724-b995-829d3fcd48ae" containerName="nova-cell1-conductor-conductor" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108685 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b624e5-b3de-4724-b995-829d3fcd48ae" 
containerName="nova-cell1-conductor-conductor" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108703 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="sg-core" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108715 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="sg-core" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108732 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6790ccc5-8f7f-4de8-bd69-652661631307" containerName="barbican-worker" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108744 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="6790ccc5-8f7f-4de8-bd69-652661631307" containerName="barbican-worker" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108758 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa336b3-abdd-43e2-9c54-6d8d34c71204" containerName="barbican-keystone-listener-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108770 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa336b3-abdd-43e2-9c54-6d8d34c71204" containerName="barbican-keystone-listener-log" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108785 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c7df0b-fc0a-4fd9-b097-72da03442510" containerName="setup-container" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108797 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c7df0b-fc0a-4fd9-b097-72da03442510" containerName="setup-container" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108817 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" containerName="glance-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108829 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" containerName="glance-log" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108844 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3d0806-00f0-46d7-a77f-f505583e49a2" containerName="nova-scheduler-scheduler" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108857 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3d0806-00f0-46d7-a77f-f505583e49a2" containerName="nova-scheduler-scheduler" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108878 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa336b3-abdd-43e2-9c54-6d8d34c71204" containerName="barbican-keystone-listener" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108890 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa336b3-abdd-43e2-9c54-6d8d34c71204" containerName="barbican-keystone-listener" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108909 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="ceilometer-notification-agent" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108921 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="ceilometer-notification-agent" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108941 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412e01f6-e4bb-4bbd-ba88-5726f3e2f87f" containerName="nova-cell0-conductor-conductor" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.108953 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="412e01f6-e4bb-4bbd-ba88-5726f3e2f87f" containerName="nova-cell0-conductor-conductor" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.108971 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6790ccc5-8f7f-4de8-bd69-652661631307" containerName="barbican-worker-log" Mar 09 03:05:49 crc 
kubenswrapper[4901]: I0309 03:05:49.108983 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="6790ccc5-8f7f-4de8-bd69-652661631307" containerName="barbican-worker-log" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109002 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerName="nova-api-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109014 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerName="nova-api-api" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109040 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" containerName="openstack-network-exporter" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109053 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" containerName="openstack-network-exporter" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109067 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac19cc68-f23c-4622-b265-6e94db65a43f" containerName="glance-httpd" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109080 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac19cc68-f23c-4622-b265-6e94db65a43f" containerName="glance-httpd" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109097 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7050aa4c-725b-482a-8b90-f1374b3a4a42" containerName="kube-state-metrics" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109109 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7050aa4c-725b-482a-8b90-f1374b3a4a42" containerName="kube-state-metrics" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109133 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df0684a-2816-4af7-97cf-00e31c542eef" containerName="galera" Mar 09 03:05:49 crc 
kubenswrapper[4901]: I0309 03:05:49.109145 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df0684a-2816-4af7-97cf-00e31c542eef" containerName="galera" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109162 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" containerName="glance-httpd" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109174 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" containerName="glance-httpd" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109195 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" containerName="placement-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109229 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" containerName="placement-log" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109300 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="ceilometer-central-agent" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109321 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="ceilometer-central-agent" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109350 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0098aa8-4248-48ec-a254-368c395308b1" containerName="mysql-bootstrap" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109389 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0098aa8-4248-48ec-a254-368c395308b1" containerName="mysql-bootstrap" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109405 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerName="nova-metadata-log" Mar 09 03:05:49 crc 
kubenswrapper[4901]: I0309 03:05:49.109420 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerName="nova-metadata-log" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109446 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerName="nova-metadata-metadata" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109462 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerName="nova-metadata-metadata" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109487 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" containerName="keystone-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109500 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" containerName="keystone-api" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109516 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerName="barbican-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109528 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerName="barbican-api" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109551 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" containerName="ovn-northd" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109565 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" containerName="ovn-northd" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109586 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="proxy-httpd" Mar 09 03:05:49 crc kubenswrapper[4901]: 
I0309 03:05:49.109597 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="proxy-httpd" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109615 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0098aa8-4248-48ec-a254-368c395308b1" containerName="galera" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109626 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0098aa8-4248-48ec-a254-368c395308b1" containerName="galera" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109638 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerName="cinder-api-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109650 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerName="cinder-api-log" Mar 09 03:05:49 crc kubenswrapper[4901]: E0309 03:05:49.109666 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerName="cinder-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109677 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerName="cinder-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109934 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerName="nova-api-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109955 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa336b3-abdd-43e2-9c54-6d8d34c71204" containerName="barbican-keystone-listener-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109969 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3d0806-00f0-46d7-a77f-f505583e49a2" containerName="nova-scheduler-scheduler" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 
03:05:49.109984 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerName="barbican-api-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.109999 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="sg-core" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110021 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="proxy-httpd" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110036 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="98538e55-cb87-49e2-9fd5-fff06d7edfdd" containerName="rabbitmq" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110055 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerName="nova-metadata-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110078 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7050aa4c-725b-482a-8b90-f1374b3a4a42" containerName="kube-state-metrics" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110102 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="6790ccc5-8f7f-4de8-bd69-652661631307" containerName="barbican-worker" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110119 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd100e4-dfd3-45a7-a97c-84a05c352883" containerName="nova-metadata-metadata" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110135 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" containerName="placement-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110147 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac19cc68-f23c-4622-b265-6e94db65a43f" containerName="glance-httpd" Mar 09 03:05:49 crc 
kubenswrapper[4901]: I0309 03:05:49.110168 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="6790ccc5-8f7f-4de8-bd69-652661631307" containerName="barbican-worker-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110190 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5efc6dd-6a36-4491-b090-b4c9301ec7d0" containerName="barbican-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110209 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" containerName="openstack-network-exporter" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110232 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0098aa8-4248-48ec-a254-368c395308b1" containerName="galera" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110270 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3e03cd-75ae-46dc-aee4-b778929cf535" containerName="memcached" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110283 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9df0684a-2816-4af7-97cf-00e31c542eef" containerName="galera" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110304 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="ceilometer-central-agent" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110323 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerName="cinder-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110359 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="26f9c7a2-e2b4-4be1-8206-6c067702cc74" containerName="ovn-northd" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110384 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa336b3-abdd-43e2-9c54-6d8d34c71204" 
containerName="barbican-keystone-listener" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110407 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="412e01f6-e4bb-4bbd-ba88-5726f3e2f87f" containerName="nova-cell0-conductor-conductor" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110425 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45b6a38-6035-4fd4-a525-5d51ac6d0a2d" containerName="keystone-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110437 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3d4e9a-122e-4894-98b2-91784a9f44e8" containerName="ceilometer-notification-agent" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110451 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab8ead2b-4e3f-4d04-b16c-7b2a08b5aa9d" containerName="nova-api-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110470 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b624e5-b3de-4724-b995-829d3fcd48ae" containerName="nova-cell1-conductor-conductor" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110496 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="719d451b-159a-4fa7-9c72-54f42fb4f216" containerName="cinder-api-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110519 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" containerName="glance-httpd" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110541 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c7df0b-fc0a-4fd9-b097-72da03442510" containerName="rabbitmq" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110564 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="966d96ae-fba9-4ecd-85e5-a81cecfb2ed3" containerName="placement-api" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110580 4901 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ac19cc68-f23c-4622-b265-6e94db65a43f" containerName="glance-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.110605 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ec135f-33b3-4be3-bb27-5bb0ea25ddce" containerName="glance-log" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.112394 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.139848 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gv8tw"] Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.170784 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-catalog-content\") pod \"redhat-operators-gv8tw\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.170875 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbrzx\" (UniqueName: \"kubernetes.io/projected/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-kube-api-access-kbrzx\") pod \"redhat-operators-gv8tw\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.170959 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-utilities\") pod \"redhat-operators-gv8tw\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.272889 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kbrzx\" (UniqueName: \"kubernetes.io/projected/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-kube-api-access-kbrzx\") pod \"redhat-operators-gv8tw\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.272977 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-utilities\") pod \"redhat-operators-gv8tw\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.273067 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-catalog-content\") pod \"redhat-operators-gv8tw\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.273505 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-utilities\") pod \"redhat-operators-gv8tw\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.273551 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-catalog-content\") pod \"redhat-operators-gv8tw\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.291909 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbrzx\" 
(UniqueName: \"kubernetes.io/projected/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-kube-api-access-kbrzx\") pod \"redhat-operators-gv8tw\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.459522 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.894280 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gv8tw"] Mar 09 03:05:49 crc kubenswrapper[4901]: I0309 03:05:49.923353 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv8tw" event={"ID":"e877c5b4-163c-4b68-b416-f6c0e7b63cc3","Type":"ContainerStarted","Data":"dc96d78e3f627e522569852cc0747ae3250271520ec946555d5be0081e0e28ed"} Mar 09 03:05:50 crc kubenswrapper[4901]: I0309 03:05:50.933821 4901 generic.go:334] "Generic (PLEG): container finished" podID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerID="caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096" exitCode=0 Mar 09 03:05:50 crc kubenswrapper[4901]: I0309 03:05:50.933943 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv8tw" event={"ID":"e877c5b4-163c-4b68-b416-f6c0e7b63cc3","Type":"ContainerDied","Data":"caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096"} Mar 09 03:05:50 crc kubenswrapper[4901]: I0309 03:05:50.936106 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 03:05:52 crc kubenswrapper[4901]: I0309 03:05:52.966343 4901 generic.go:334] "Generic (PLEG): container finished" podID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerID="14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073" exitCode=0 Mar 09 03:05:52 crc kubenswrapper[4901]: I0309 03:05:52.966408 4901 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv8tw" event={"ID":"e877c5b4-163c-4b68-b416-f6c0e7b63cc3","Type":"ContainerDied","Data":"14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073"} Mar 09 03:05:53 crc kubenswrapper[4901]: I0309 03:05:53.981137 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv8tw" event={"ID":"e877c5b4-163c-4b68-b416-f6c0e7b63cc3","Type":"ContainerStarted","Data":"588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced"} Mar 09 03:05:54 crc kubenswrapper[4901]: I0309 03:05:54.003684 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gv8tw" podStartSLOduration=2.576828757 podStartE2EDuration="5.003663765s" podCreationTimestamp="2026-03-09 03:05:49 +0000 UTC" firstStartedPulling="2026-03-09 03:05:50.935897437 +0000 UTC m=+1475.525561169" lastFinishedPulling="2026-03-09 03:05:53.362732405 +0000 UTC m=+1477.952396177" observedRunningTime="2026-03-09 03:05:54.001986752 +0000 UTC m=+1478.591650494" watchObservedRunningTime="2026-03-09 03:05:54.003663765 +0000 UTC m=+1478.593327507" Mar 09 03:05:54 crc kubenswrapper[4901]: E0309 03:05:54.034931 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:54 crc kubenswrapper[4901]: E0309 03:05:54.035380 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" 
containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:54 crc kubenswrapper[4901]: E0309 03:05:54.035734 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:54 crc kubenswrapper[4901]: E0309 03:05:54.035777 4901 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server" Mar 09 03:05:54 crc kubenswrapper[4901]: E0309 03:05:54.036416 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:54 crc kubenswrapper[4901]: E0309 03:05:54.039306 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:54 crc kubenswrapper[4901]: E0309 03:05:54.041331 4901 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:54 crc kubenswrapper[4901]: E0309 03:05:54.041408 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovs-vswitchd" Mar 09 03:05:54 crc kubenswrapper[4901]: I0309 03:05:54.999217 4901 generic.go:334] "Generic (PLEG): container finished" podID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerID="fd7c32b5c4e4206c35907a41758ab9c087a3999127a4db227227c92656f344b5" exitCode=0 Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:54.999335 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b7f6df545-whtgc" event={"ID":"a29f795d-59d2-4e43-a6ee-6190dc0ad67d","Type":"ContainerDied","Data":"fd7c32b5c4e4206c35907a41758ab9c087a3999127a4db227227c92656f344b5"} Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.371319 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.486955 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrv9x\" (UniqueName: \"kubernetes.io/projected/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-kube-api-access-hrv9x\") pod \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.487345 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-httpd-config\") pod \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.487469 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-public-tls-certs\") pod \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.487626 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-ovndb-tls-certs\") pod \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.488114 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-internal-tls-certs\") pod \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.488317 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-config\") pod \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.488445 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-combined-ca-bundle\") pod \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\" (UID: \"a29f795d-59d2-4e43-a6ee-6190dc0ad67d\") " Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.493778 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a29f795d-59d2-4e43-a6ee-6190dc0ad67d" (UID: "a29f795d-59d2-4e43-a6ee-6190dc0ad67d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.493972 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-kube-api-access-hrv9x" (OuterVolumeSpecName: "kube-api-access-hrv9x") pod "a29f795d-59d2-4e43-a6ee-6190dc0ad67d" (UID: "a29f795d-59d2-4e43-a6ee-6190dc0ad67d"). InnerVolumeSpecName "kube-api-access-hrv9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.535410 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-config" (OuterVolumeSpecName: "config") pod "a29f795d-59d2-4e43-a6ee-6190dc0ad67d" (UID: "a29f795d-59d2-4e43-a6ee-6190dc0ad67d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.537485 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a29f795d-59d2-4e43-a6ee-6190dc0ad67d" (UID: "a29f795d-59d2-4e43-a6ee-6190dc0ad67d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.549879 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a29f795d-59d2-4e43-a6ee-6190dc0ad67d" (UID: "a29f795d-59d2-4e43-a6ee-6190dc0ad67d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.566091 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a29f795d-59d2-4e43-a6ee-6190dc0ad67d" (UID: "a29f795d-59d2-4e43-a6ee-6190dc0ad67d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.583523 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a29f795d-59d2-4e43-a6ee-6190dc0ad67d" (UID: "a29f795d-59d2-4e43-a6ee-6190dc0ad67d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.590052 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.590107 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.590121 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.590133 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrv9x\" (UniqueName: \"kubernetes.io/projected/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-kube-api-access-hrv9x\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.590147 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.590157 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:55 crc kubenswrapper[4901]: I0309 03:05:55.590167 4901 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29f795d-59d2-4e43-a6ee-6190dc0ad67d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 03:05:56 crc kubenswrapper[4901]: I0309 03:05:56.015435 4901 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b7f6df545-whtgc" Mar 09 03:05:56 crc kubenswrapper[4901]: I0309 03:05:56.015301 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b7f6df545-whtgc" event={"ID":"a29f795d-59d2-4e43-a6ee-6190dc0ad67d","Type":"ContainerDied","Data":"0a9683b7241c768c2c7616fd8067934d41f3bad519d9e1af200d5f4cc77411d5"} Mar 09 03:05:56 crc kubenswrapper[4901]: I0309 03:05:56.024355 4901 scope.go:117] "RemoveContainer" containerID="42ab54fcd0a589e516d8eb72267b465d9fcb7af80252fced9a388a56a784446b" Mar 09 03:05:56 crc kubenswrapper[4901]: I0309 03:05:56.071779 4901 scope.go:117] "RemoveContainer" containerID="fd7c32b5c4e4206c35907a41758ab9c087a3999127a4db227227c92656f344b5" Mar 09 03:05:56 crc kubenswrapper[4901]: I0309 03:05:56.078176 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b7f6df545-whtgc"] Mar 09 03:05:56 crc kubenswrapper[4901]: I0309 03:05:56.085432 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5b7f6df545-whtgc"] Mar 09 03:05:56 crc kubenswrapper[4901]: I0309 03:05:56.123286 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" path="/var/lib/kubelet/pods/a29f795d-59d2-4e43-a6ee-6190dc0ad67d/volumes" Mar 09 03:05:59 crc kubenswrapper[4901]: E0309 03:05:59.035246 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:59 crc kubenswrapper[4901]: E0309 03:05:59.037941 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:59 crc kubenswrapper[4901]: E0309 03:05:59.038314 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:59 crc kubenswrapper[4901]: E0309 03:05:59.038704 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:05:59 crc kubenswrapper[4901]: E0309 03:05:59.038815 4901 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server" Mar 09 03:05:59 crc kubenswrapper[4901]: E0309 03:05:59.042924 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:59 crc kubenswrapper[4901]: E0309 03:05:59.047353 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:05:59 crc kubenswrapper[4901]: E0309 03:05:59.047439 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovs-vswitchd" Mar 09 03:05:59 crc kubenswrapper[4901]: I0309 03:05:59.460371 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:05:59 crc kubenswrapper[4901]: I0309 03:05:59.460635 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.157960 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550426-wkjmq"] Mar 09 03:06:00 crc kubenswrapper[4901]: E0309 03:06:00.159534 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerName="neutron-httpd" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.159551 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerName="neutron-httpd" Mar 09 03:06:00 crc kubenswrapper[4901]: E0309 03:06:00.159597 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerName="neutron-api" Mar 09 03:06:00 crc 
kubenswrapper[4901]: I0309 03:06:00.159604 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerName="neutron-api" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.160784 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerName="neutron-api" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.160833 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29f795d-59d2-4e43-a6ee-6190dc0ad67d" containerName="neutron-httpd" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.161883 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550426-wkjmq" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.168753 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.168924 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.168957 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.177650 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550426-wkjmq"] Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.277253 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pcft\" (UniqueName: \"kubernetes.io/projected/9e1f6875-8b63-4eff-b3bb-e51285f5e8b0-kube-api-access-5pcft\") pod \"auto-csr-approver-29550426-wkjmq\" (UID: \"9e1f6875-8b63-4eff-b3bb-e51285f5e8b0\") " pod="openshift-infra/auto-csr-approver-29550426-wkjmq" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.378934 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5pcft\" (UniqueName: \"kubernetes.io/projected/9e1f6875-8b63-4eff-b3bb-e51285f5e8b0-kube-api-access-5pcft\") pod \"auto-csr-approver-29550426-wkjmq\" (UID: \"9e1f6875-8b63-4eff-b3bb-e51285f5e8b0\") " pod="openshift-infra/auto-csr-approver-29550426-wkjmq" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.402696 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pcft\" (UniqueName: \"kubernetes.io/projected/9e1f6875-8b63-4eff-b3bb-e51285f5e8b0-kube-api-access-5pcft\") pod \"auto-csr-approver-29550426-wkjmq\" (UID: \"9e1f6875-8b63-4eff-b3bb-e51285f5e8b0\") " pod="openshift-infra/auto-csr-approver-29550426-wkjmq" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.496456 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550426-wkjmq" Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.507462 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gv8tw" podUID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerName="registry-server" probeResult="failure" output=< Mar 09 03:06:00 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 03:06:00 crc kubenswrapper[4901]: > Mar 09 03:06:00 crc kubenswrapper[4901]: I0309 03:06:00.966253 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550426-wkjmq"] Mar 09 03:06:01 crc kubenswrapper[4901]: I0309 03:06:01.070736 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550426-wkjmq" event={"ID":"9e1f6875-8b63-4eff-b3bb-e51285f5e8b0","Type":"ContainerStarted","Data":"5d97298dd874a43e534b5e5aac35f57f99f0a5d75b1c722d365672772b553481"} Mar 09 03:06:03 crc kubenswrapper[4901]: I0309 03:06:03.089855 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="9e1f6875-8b63-4eff-b3bb-e51285f5e8b0" containerID="716ee6debd7ff6a787ab13690b030042fb8f2f2451f2acd85f7cbb0616b3196e" exitCode=0 Mar 09 03:06:03 crc kubenswrapper[4901]: I0309 03:06:03.089916 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550426-wkjmq" event={"ID":"9e1f6875-8b63-4eff-b3bb-e51285f5e8b0","Type":"ContainerDied","Data":"716ee6debd7ff6a787ab13690b030042fb8f2f2451f2acd85f7cbb0616b3196e"} Mar 09 03:06:04 crc kubenswrapper[4901]: E0309 03:06:04.035042 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:06:04 crc kubenswrapper[4901]: E0309 03:06:04.035661 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:06:04 crc kubenswrapper[4901]: E0309 03:06:04.036517 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 09 03:06:04 crc kubenswrapper[4901]: E0309 03:06:04.036623 4901 prober.go:104] "Probe errored" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of 0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server" Mar 09 03:06:04 crc kubenswrapper[4901]: E0309 03:06:04.037022 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:06:04 crc kubenswrapper[4901]: E0309 03:06:04.039100 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:06:04 crc kubenswrapper[4901]: E0309 03:06:04.040922 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 09 03:06:04 crc kubenswrapper[4901]: E0309 03:06:04.040965 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-hltph" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovs-vswitchd" Mar 09 03:06:04 crc kubenswrapper[4901]: I0309 03:06:04.514773 4901 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550426-wkjmq" Mar 09 03:06:04 crc kubenswrapper[4901]: I0309 03:06:04.642352 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pcft\" (UniqueName: \"kubernetes.io/projected/9e1f6875-8b63-4eff-b3bb-e51285f5e8b0-kube-api-access-5pcft\") pod \"9e1f6875-8b63-4eff-b3bb-e51285f5e8b0\" (UID: \"9e1f6875-8b63-4eff-b3bb-e51285f5e8b0\") " Mar 09 03:06:04 crc kubenswrapper[4901]: I0309 03:06:04.650644 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e1f6875-8b63-4eff-b3bb-e51285f5e8b0-kube-api-access-5pcft" (OuterVolumeSpecName: "kube-api-access-5pcft") pod "9e1f6875-8b63-4eff-b3bb-e51285f5e8b0" (UID: "9e1f6875-8b63-4eff-b3bb-e51285f5e8b0"). InnerVolumeSpecName "kube-api-access-5pcft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:06:04 crc kubenswrapper[4901]: I0309 03:06:04.746100 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pcft\" (UniqueName: \"kubernetes.io/projected/9e1f6875-8b63-4eff-b3bb-e51285f5e8b0-kube-api-access-5pcft\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:05 crc kubenswrapper[4901]: I0309 03:06:05.133979 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550426-wkjmq" event={"ID":"9e1f6875-8b63-4eff-b3bb-e51285f5e8b0","Type":"ContainerDied","Data":"5d97298dd874a43e534b5e5aac35f57f99f0a5d75b1c722d365672772b553481"} Mar 09 03:06:05 crc kubenswrapper[4901]: I0309 03:06:05.134049 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d97298dd874a43e534b5e5aac35f57f99f0a5d75b1c722d365672772b553481" Mar 09 03:06:05 crc kubenswrapper[4901]: I0309 03:06:05.134061 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550426-wkjmq" Mar 09 03:06:05 crc kubenswrapper[4901]: I0309 03:06:05.591972 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550420-9l54t"] Mar 09 03:06:05 crc kubenswrapper[4901]: I0309 03:06:05.601405 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550420-9l54t"] Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.126814 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ddd9ca0-2302-4a94-9d07-fbf8db553b73" path="/var/lib/kubelet/pods/0ddd9ca0-2302-4a94-9d07-fbf8db553b73/volumes" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.158479 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hltph_4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89/ovs-vswitchd/0.log" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.160007 4901 generic.go:334] "Generic (PLEG): container finished" podID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" exitCode=137 Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.160071 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hltph" event={"ID":"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89","Type":"ContainerDied","Data":"b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d"} Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.641022 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hltph_4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89/ovs-vswitchd/0.log" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.641912 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.750201 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.817733 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-run\") pod \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.817810 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-lib\") pod \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.817863 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcl27\" (UniqueName: \"kubernetes.io/projected/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-kube-api-access-fcl27\") pod \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.817907 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-etc-ovs\") pod \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.817949 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-scripts\") pod \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.817976 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-log\") pod \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\" (UID: \"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.818429 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-log" (OuterVolumeSpecName: "var-log") pod "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" (UID: "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.818478 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-run" (OuterVolumeSpecName: "var-run") pod "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" (UID: "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.818501 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-lib" (OuterVolumeSpecName: "var-lib") pod "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" (UID: "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.819313 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" (UID: "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.820540 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-scripts" (OuterVolumeSpecName: "scripts") pod "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" (UID: "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.830991 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-kube-api-access-fcl27" (OuterVolumeSpecName: "kube-api-access-fcl27") pod "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" (UID: "4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89"). InnerVolumeSpecName "kube-api-access-fcl27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.852963 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.918764 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data-custom\") pod \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.918899 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-scripts\") pod \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.918970 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w864\" (UniqueName: \"kubernetes.io/projected/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-kube-api-access-9w864\") pod \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.918991 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-etc-machine-id\") pod \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.919013 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data\") pod \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.919084 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-combined-ca-bundle\") pod \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\" (UID: \"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9\") " Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.919392 4901 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.919413 4901 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-lib\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.919426 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcl27\" (UniqueName: \"kubernetes.io/projected/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-kube-api-access-fcl27\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.919437 4901 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.919447 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.919457 4901 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89-var-log\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.921538 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") 
pod "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" (UID: "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.927546 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" (UID: "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.928418 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-kube-api-access-9w864" (OuterVolumeSpecName: "kube-api-access-9w864") pod "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" (UID: "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9"). InnerVolumeSpecName "kube-api-access-9w864". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.935436 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-scripts" (OuterVolumeSpecName: "scripts") pod "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" (UID: "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:06:06 crc kubenswrapper[4901]: I0309 03:06:06.972399 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" (UID: "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.021215 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"56e038ec-a406-4f6b-9b8a-135c56be7514\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.021957 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-cache\") pod \"56e038ec-a406-4f6b-9b8a-135c56be7514\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.023632 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-cache" (OuterVolumeSpecName: "cache") pod "56e038ec-a406-4f6b-9b8a-135c56be7514" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.023672 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rtff\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-kube-api-access-2rtff\") pod \"56e038ec-a406-4f6b-9b8a-135c56be7514\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.023751 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-lock\") pod \"56e038ec-a406-4f6b-9b8a-135c56be7514\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.023806 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift\") pod \"56e038ec-a406-4f6b-9b8a-135c56be7514\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.023835 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e038ec-a406-4f6b-9b8a-135c56be7514-combined-ca-bundle\") pod \"56e038ec-a406-4f6b-9b8a-135c56be7514\" (UID: \"56e038ec-a406-4f6b-9b8a-135c56be7514\") " Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.024910 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-lock" (OuterVolumeSpecName: "lock") pod "56e038ec-a406-4f6b-9b8a-135c56be7514" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.025402 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.025418 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w864\" (UniqueName: \"kubernetes.io/projected/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-kube-api-access-9w864\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.025428 4901 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.025436 4901 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-cache\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.025444 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.025452 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.025459 4901 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/56e038ec-a406-4f6b-9b8a-135c56be7514-lock\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.035424 4901 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "56e038ec-a406-4f6b-9b8a-135c56be7514" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.038003 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "56e038ec-a406-4f6b-9b8a-135c56be7514" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.045361 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-kube-api-access-2rtff" (OuterVolumeSpecName: "kube-api-access-2rtff") pod "56e038ec-a406-4f6b-9b8a-135c56be7514" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514"). InnerVolumeSpecName "kube-api-access-2rtff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.111094 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data" (OuterVolumeSpecName: "config-data") pod "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" (UID: "5276569a-5e4d-4bbb-ab39-f9a420e4a3e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.126615 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.126642 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rtff\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-kube-api-access-2rtff\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.126654 4901 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/56e038ec-a406-4f6b-9b8a-135c56be7514-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.126678 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.139909 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.176201 4901 generic.go:334] "Generic (PLEG): container finished" podID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" containerID="facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b" exitCode=137 Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.176265 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9","Type":"ContainerDied","Data":"facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b"} Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.176315 4901 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5276569a-5e4d-4bbb-ab39-f9a420e4a3e9","Type":"ContainerDied","Data":"f38b4b55666610f8dbc59a27371dfe35ab766a60ffe8fff7f77a0ae5cd048172"} Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.176337 4901 scope.go:117] "RemoveContainer" containerID="91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.176524 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.180321 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hltph_4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89/ovs-vswitchd/0.log" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.181204 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hltph" event={"ID":"4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89","Type":"ContainerDied","Data":"b5c023718819a580beade659ae310802783879f3abc6a9a4e91c499e588bafe3"} Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.181282 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hltph" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.190609 4901 generic.go:334] "Generic (PLEG): container finished" podID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerID="5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d" exitCode=137 Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.190643 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d"} Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.190667 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"56e038ec-a406-4f6b-9b8a-135c56be7514","Type":"ContainerDied","Data":"c53e28cc744f24ff1fba305406ef3d31a267154c97d80480108d1da32a3bf7b0"} Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.190751 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.229864 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.284618 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-hltph"] Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.291682 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-hltph"] Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.295966 4901 scope.go:117] "RemoveContainer" containerID="facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.298824 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.305821 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.315862 4901 scope.go:117] "RemoveContainer" containerID="91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.317756 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075\": container with ID starting with 91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075 not found: ID does not exist" containerID="91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.317789 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075"} err="failed to 
get container status \"91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075\": rpc error: code = NotFound desc = could not find container \"91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075\": container with ID starting with 91cbb6deb4ec6b7d9a4c55257a984d7c3866e71cfa748163111d9fb04ddec075 not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.317812 4901 scope.go:117] "RemoveContainer" containerID="facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.318109 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b\": container with ID starting with facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b not found: ID does not exist" containerID="facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.318130 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b"} err="failed to get container status \"facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b\": rpc error: code = NotFound desc = could not find container \"facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b\": container with ID starting with facb483ac1a2684cbdb2157f973980b22dbea7114e3234f3ba9ca3d2bc68a22b not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.318142 4901 scope.go:117] "RemoveContainer" containerID="b269dfeb8bccf0d2d404b3336072d680e723d68372ed379a3aaa3d30efaa387d" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.342400 4901 scope.go:117] "RemoveContainer" containerID="0ebf319a3bbc109bfb53682936200a7e2d64ef909307f9919c3a44fc2eece41f" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 
03:06:07.366521 4901 scope.go:117] "RemoveContainer" containerID="858a5a877443194b8865cd485f5efb4064ab5bc8f500d6a40ba7b9d488f969ad" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.369176 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e038ec-a406-4f6b-9b8a-135c56be7514-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56e038ec-a406-4f6b-9b8a-135c56be7514" (UID: "56e038ec-a406-4f6b-9b8a-135c56be7514"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.395621 4901 scope.go:117] "RemoveContainer" containerID="5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.423935 4901 scope.go:117] "RemoveContainer" containerID="98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.432354 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e038ec-a406-4f6b-9b8a-135c56be7514-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.445274 4901 scope.go:117] "RemoveContainer" containerID="ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.471299 4901 scope.go:117] "RemoveContainer" containerID="1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.497492 4901 scope.go:117] "RemoveContainer" containerID="e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.536442 4901 scope.go:117] "RemoveContainer" containerID="2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.562048 4901 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.567475 4901 scope.go:117] "RemoveContainer" containerID="b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.568963 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.587622 4901 scope.go:117] "RemoveContainer" containerID="deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.615779 4901 scope.go:117] "RemoveContainer" containerID="000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.642338 4901 scope.go:117] "RemoveContainer" containerID="efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.669695 4901 scope.go:117] "RemoveContainer" containerID="88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.691784 4901 scope.go:117] "RemoveContainer" containerID="8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.718416 4901 scope.go:117] "RemoveContainer" containerID="b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.744419 4901 scope.go:117] "RemoveContainer" containerID="0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.773444 4901 scope.go:117] "RemoveContainer" containerID="5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.814748 4901 scope.go:117] "RemoveContainer" 
containerID="5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.816135 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d\": container with ID starting with 5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d not found: ID does not exist" containerID="5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.816177 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d"} err="failed to get container status \"5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d\": rpc error: code = NotFound desc = could not find container \"5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d\": container with ID starting with 5a61a9051d7facf2f0fe68fbc34956eedc046cea9aa7aa425a2ad7983580763d not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.816206 4901 scope.go:117] "RemoveContainer" containerID="98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.817076 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6\": container with ID starting with 98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6 not found: ID does not exist" containerID="98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.817101 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6"} err="failed to get container status \"98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6\": rpc error: code = NotFound desc = could not find container \"98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6\": container with ID starting with 98baa58acecf6a889a2bcf29944696da987b06e0a31a75f7580e5b62f1cf2db6 not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.817119 4901 scope.go:117] "RemoveContainer" containerID="ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.817922 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11\": container with ID starting with ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11 not found: ID does not exist" containerID="ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.817950 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11"} err="failed to get container status \"ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11\": rpc error: code = NotFound desc = could not find container \"ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11\": container with ID starting with ec7b6ca857145cd8ffde882836905792898feadc504e21586c4cd6aba7ec5a11 not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.817967 4901 scope.go:117] "RemoveContainer" containerID="1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.818704 4901 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973\": container with ID starting with 1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973 not found: ID does not exist" containerID="1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.818731 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973"} err="failed to get container status \"1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973\": rpc error: code = NotFound desc = could not find container \"1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973\": container with ID starting with 1f84aa3c39b4c9622fe4347965d91df9c469a92794647651d5a87ec099686973 not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.818746 4901 scope.go:117] "RemoveContainer" containerID="e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.819437 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17\": container with ID starting with e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17 not found: ID does not exist" containerID="e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.819461 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17"} err="failed to get container status \"e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17\": rpc error: code = NotFound desc = could not find container 
\"e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17\": container with ID starting with e1eb6f0364aa3902d58b824b7fb25b904c93c1eeb008b6cf519903b0f5d38d17 not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.819476 4901 scope.go:117] "RemoveContainer" containerID="2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.819880 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6\": container with ID starting with 2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6 not found: ID does not exist" containerID="2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.819903 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6"} err="failed to get container status \"2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6\": rpc error: code = NotFound desc = could not find container \"2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6\": container with ID starting with 2a68b1ca4efba68812d3a303a2ceab2b4b6448914471d7a3decd7cf6b6f34bb6 not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.819918 4901 scope.go:117] "RemoveContainer" containerID="b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.820291 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf\": container with ID starting with b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf not found: ID does not exist" 
containerID="b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.820320 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf"} err="failed to get container status \"b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf\": rpc error: code = NotFound desc = could not find container \"b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf\": container with ID starting with b23a37af3d719448611f4ad6a32fe5c2c308cd7ba1a776e15eadba7f364fb7bf not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.820335 4901 scope.go:117] "RemoveContainer" containerID="deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.820702 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd\": container with ID starting with deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd not found: ID does not exist" containerID="deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.820725 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd"} err="failed to get container status \"deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd\": rpc error: code = NotFound desc = could not find container \"deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd\": container with ID starting with deebe1091232da3c6c138fb30edea0b726dc89153aad8d9068b83577825506dd not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.820741 4901 scope.go:117] 
"RemoveContainer" containerID="000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.821039 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14\": container with ID starting with 000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14 not found: ID does not exist" containerID="000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.821063 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14"} err="failed to get container status \"000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14\": rpc error: code = NotFound desc = could not find container \"000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14\": container with ID starting with 000fe2a3e4881852b517c846ea1372dc4b8cf6aada1cff25241e58df0a0f1d14 not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.821079 4901 scope.go:117] "RemoveContainer" containerID="efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.821695 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8\": container with ID starting with efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8 not found: ID does not exist" containerID="efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.821721 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8"} err="failed to get container status \"efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8\": rpc error: code = NotFound desc = could not find container \"efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8\": container with ID starting with efdf4e619a6d24b736b4544527ea94436e6c978c7ceba7ef958652cf7cb597b8 not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.821736 4901 scope.go:117] "RemoveContainer" containerID="88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.822088 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f\": container with ID starting with 88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f not found: ID does not exist" containerID="88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.822111 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f"} err="failed to get container status \"88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f\": rpc error: code = NotFound desc = could not find container \"88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f\": container with ID starting with 88fc84894c5e86912090b30b3eb8149fd1b794d55763d71b556f863fbc68ed0f not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.822127 4901 scope.go:117] "RemoveContainer" containerID="8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.822453 4901 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4\": container with ID starting with 8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4 not found: ID does not exist" containerID="8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.822477 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4"} err="failed to get container status \"8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4\": rpc error: code = NotFound desc = could not find container \"8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4\": container with ID starting with 8439b508bcb9b7e1d34dad860ed688032784076964cafe31bf8854469d12a0c4 not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.822492 4901 scope.go:117] "RemoveContainer" containerID="b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.822841 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c\": container with ID starting with b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c not found: ID does not exist" containerID="b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.822865 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c"} err="failed to get container status \"b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c\": rpc error: code = NotFound desc = could not find container 
\"b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c\": container with ID starting with b505e13d0afa284626e3a000524fb455406b74e6c642956f44df576c999c444c not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.822880 4901 scope.go:117] "RemoveContainer" containerID="0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.823182 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954\": container with ID starting with 0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954 not found: ID does not exist" containerID="0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.823204 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954"} err="failed to get container status \"0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954\": rpc error: code = NotFound desc = could not find container \"0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954\": container with ID starting with 0508076142d286eb3dc29b982443d11cf9f76d1d98901e2dde15dd0067359954 not found: ID does not exist" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.823242 4901 scope.go:117] "RemoveContainer" containerID="5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9" Mar 09 03:06:07 crc kubenswrapper[4901]: E0309 03:06:07.823733 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9\": container with ID starting with 5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9 not found: ID does not exist" 
containerID="5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9" Mar 09 03:06:07 crc kubenswrapper[4901]: I0309 03:06:07.823755 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9"} err="failed to get container status \"5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9\": rpc error: code = NotFound desc = could not find container \"5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9\": container with ID starting with 5bf4c0e106e3a5033b95c4d3d3124a40b5aeabe081706850be3c85ef4ff88af9 not found: ID does not exist" Mar 09 03:06:08 crc kubenswrapper[4901]: I0309 03:06:08.121866 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" path="/var/lib/kubelet/pods/4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89/volumes" Mar 09 03:06:08 crc kubenswrapper[4901]: I0309 03:06:08.123504 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" path="/var/lib/kubelet/pods/5276569a-5e4d-4bbb-ab39-f9a420e4a3e9/volumes" Mar 09 03:06:08 crc kubenswrapper[4901]: I0309 03:06:08.124821 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" path="/var/lib/kubelet/pods/56e038ec-a406-4f6b-9b8a-135c56be7514/volumes" Mar 09 03:06:09 crc kubenswrapper[4901]: I0309 03:06:09.533420 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:06:09 crc kubenswrapper[4901]: I0309 03:06:09.599524 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:06:09 crc kubenswrapper[4901]: I0309 03:06:09.771140 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gv8tw"] Mar 09 03:06:11 crc 
kubenswrapper[4901]: I0309 03:06:11.234902 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gv8tw" podUID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerName="registry-server" containerID="cri-o://588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced" gracePeriod=2 Mar 09 03:06:11 crc kubenswrapper[4901]: I0309 03:06:11.772124 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:06:11 crc kubenswrapper[4901]: I0309 03:06:11.905073 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-utilities\") pod \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " Mar 09 03:06:11 crc kubenswrapper[4901]: I0309 03:06:11.905149 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-catalog-content\") pod \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " Mar 09 03:06:11 crc kubenswrapper[4901]: I0309 03:06:11.905348 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbrzx\" (UniqueName: \"kubernetes.io/projected/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-kube-api-access-kbrzx\") pod \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\" (UID: \"e877c5b4-163c-4b68-b416-f6c0e7b63cc3\") " Mar 09 03:06:11 crc kubenswrapper[4901]: I0309 03:06:11.906280 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-utilities" (OuterVolumeSpecName: "utilities") pod "e877c5b4-163c-4b68-b416-f6c0e7b63cc3" (UID: "e877c5b4-163c-4b68-b416-f6c0e7b63cc3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:06:11 crc kubenswrapper[4901]: I0309 03:06:11.919706 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-kube-api-access-kbrzx" (OuterVolumeSpecName: "kube-api-access-kbrzx") pod "e877c5b4-163c-4b68-b416-f6c0e7b63cc3" (UID: "e877c5b4-163c-4b68-b416-f6c0e7b63cc3"). InnerVolumeSpecName "kube-api-access-kbrzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.007542 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbrzx\" (UniqueName: \"kubernetes.io/projected/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-kube-api-access-kbrzx\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.007581 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.053143 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e877c5b4-163c-4b68-b416-f6c0e7b63cc3" (UID: "e877c5b4-163c-4b68-b416-f6c0e7b63cc3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.108950 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e877c5b4-163c-4b68-b416-f6c0e7b63cc3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.248337 4901 generic.go:334] "Generic (PLEG): container finished" podID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerID="588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced" exitCode=0 Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.248406 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv8tw" event={"ID":"e877c5b4-163c-4b68-b416-f6c0e7b63cc3","Type":"ContainerDied","Data":"588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced"} Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.248435 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gv8tw" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.248459 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv8tw" event={"ID":"e877c5b4-163c-4b68-b416-f6c0e7b63cc3","Type":"ContainerDied","Data":"dc96d78e3f627e522569852cc0747ae3250271520ec946555d5be0081e0e28ed"} Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.248490 4901 scope.go:117] "RemoveContainer" containerID="588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.282730 4901 scope.go:117] "RemoveContainer" containerID="14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.286767 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gv8tw"] Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.297139 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gv8tw"] Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.316215 4901 scope.go:117] "RemoveContainer" containerID="caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.342791 4901 scope.go:117] "RemoveContainer" containerID="588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced" Mar 09 03:06:12 crc kubenswrapper[4901]: E0309 03:06:12.343493 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced\": container with ID starting with 588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced not found: ID does not exist" containerID="588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.343568 4901 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced"} err="failed to get container status \"588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced\": rpc error: code = NotFound desc = could not find container \"588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced\": container with ID starting with 588db61432b4d11ddc16c180679a683d2caa34444530a30a56a1668d538c9ced not found: ID does not exist" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.343615 4901 scope.go:117] "RemoveContainer" containerID="14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073" Mar 09 03:06:12 crc kubenswrapper[4901]: E0309 03:06:12.344205 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073\": container with ID starting with 14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073 not found: ID does not exist" containerID="14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.344341 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073"} err="failed to get container status \"14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073\": rpc error: code = NotFound desc = could not find container \"14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073\": container with ID starting with 14b536e8d2ce30cd9c3a17a7bafcd53ae03e8e1282bd6e75a8a2cb8406779073 not found: ID does not exist" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.344367 4901 scope.go:117] "RemoveContainer" containerID="caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096" Mar 09 03:06:12 crc kubenswrapper[4901]: E0309 
03:06:12.344756 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096\": container with ID starting with caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096 not found: ID does not exist" containerID="caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096" Mar 09 03:06:12 crc kubenswrapper[4901]: I0309 03:06:12.344796 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096"} err="failed to get container status \"caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096\": rpc error: code = NotFound desc = could not find container \"caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096\": container with ID starting with caa86e2ebd16d04619ac9b9870dde63528745c087437e0a93aa523fc41851096 not found: ID does not exist" Mar 09 03:06:14 crc kubenswrapper[4901]: I0309 03:06:14.122205 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" path="/var/lib/kubelet/pods/e877c5b4-163c-4b68-b416-f6c0e7b63cc3/volumes" Mar 09 03:06:30 crc kubenswrapper[4901]: I0309 03:06:30.863167 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:06:30 crc kubenswrapper[4901]: I0309 03:06:30.863883 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 09 03:06:34 crc kubenswrapper[4901]: I0309 03:06:34.480102 4901 scope.go:117] "RemoveContainer" containerID="2e9420df9d722ebd4bebb2b0d2034e677b415c23cb883ab6d8be4a18aba0d18c" Mar 09 03:07:00 crc kubenswrapper[4901]: I0309 03:07:00.863782 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:07:00 crc kubenswrapper[4901]: I0309 03:07:00.864572 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:07:30 crc kubenswrapper[4901]: I0309 03:07:30.862807 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:07:30 crc kubenswrapper[4901]: I0309 03:07:30.863511 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:07:30 crc kubenswrapper[4901]: I0309 03:07:30.863570 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 03:07:30 crc kubenswrapper[4901]: I0309 03:07:30.864364 4901 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e3004d260bc17a7df9a3f09f9c3fb88b56d94af0a91dbe7f057c714451b1f515"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 03:07:30 crc kubenswrapper[4901]: I0309 03:07:30.864461 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://e3004d260bc17a7df9a3f09f9c3fb88b56d94af0a91dbe7f057c714451b1f515" gracePeriod=600 Mar 09 03:07:31 crc kubenswrapper[4901]: I0309 03:07:31.059304 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="e3004d260bc17a7df9a3f09f9c3fb88b56d94af0a91dbe7f057c714451b1f515" exitCode=0 Mar 09 03:07:31 crc kubenswrapper[4901]: I0309 03:07:31.059363 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"e3004d260bc17a7df9a3f09f9c3fb88b56d94af0a91dbe7f057c714451b1f515"} Mar 09 03:07:31 crc kubenswrapper[4901]: I0309 03:07:31.059457 4901 scope.go:117] "RemoveContainer" containerID="9277963b36f2cb3e2457299d51b88e2cddec56d32cd2a3c3337a07a6a046785b" Mar 09 03:07:32 crc kubenswrapper[4901]: I0309 03:07:32.070662 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2"} Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.381363 4901 scope.go:117] "RemoveContainer" 
containerID="ddd5a5a2fd5db2b102441a537f515378dc011c9879e9d5aab1e360514034a5bc" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.423753 4901 scope.go:117] "RemoveContainer" containerID="9006efa47acc80f02568c7e41f3501e04cd4ba5afcd137f8e6891cbea2267262" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.473999 4901 scope.go:117] "RemoveContainer" containerID="ce0cbdc3bc8aab4b550b8ac5657a16eab3fb78b29d770ddb8c402a6b846084d9" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.512883 4901 scope.go:117] "RemoveContainer" containerID="d5910d7c426253115f7317e5b034b541d545d2f6be616729dbcd814afc42fd07" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.558328 4901 scope.go:117] "RemoveContainer" containerID="9dd7765fed399b21a402a85557bb2e3752b30c45c4c7b942075acb9f6b5c5dd7" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.597991 4901 scope.go:117] "RemoveContainer" containerID="9c89ebccb26bac47875a363934008992acf70441ad49375a8adc23935e857520" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.640601 4901 scope.go:117] "RemoveContainer" containerID="7636d66231ac9d71a66a7ce47bddddcc5f7b6d3fa536ff6823ddb5d6e4ae9419" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.668700 4901 scope.go:117] "RemoveContainer" containerID="2a9015c6301a41f7bb9edf9e61e1f441f7386e0db5929e6264964b6d4abbcbdb" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.699707 4901 scope.go:117] "RemoveContainer" containerID="6bf103e4ec5e9ca0d5bc452b5db7fc5f1271184e38999c1f0e1d1b833052187b" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.739423 4901 scope.go:117] "RemoveContainer" containerID="34a6b8e27fa535bcf55548c59ff362d66bb423d69d3e0871101811cba5ab368d" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.785934 4901 scope.go:117] "RemoveContainer" containerID="a2ad28f5f864002403825bfa3b2973d7bd3cc4be809b84ffdf5a9e79dab3db9b" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.809884 4901 scope.go:117] "RemoveContainer" 
containerID="97ab13e50a94e0d652fdbd979cbe25cc835cb2286e4d09687a06d52e3b5f01f1" Mar 09 03:07:35 crc kubenswrapper[4901]: I0309 03:07:35.836659 4901 scope.go:117] "RemoveContainer" containerID="c97e0e91fc19928d4cd2d28f3a4db51e71395cd30f0e8c6635ab7ce7d7ad2c6b" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.165962 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550428-b4t88"] Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.168324 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-reaper" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.168485 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-reaper" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.168622 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-expirer" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.168738 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-expirer" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.168854 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.168976 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.169103 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="swift-recon-cron" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.169257 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" 
containerName="swift-recon-cron" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.169451 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-replicator" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.169582 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-replicator" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.169702 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-server" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.169813 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-server" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.169934 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="rsync" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.170076 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="rsync" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.170203 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-auditor" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.170394 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-auditor" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.170529 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-auditor" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.170645 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" 
containerName="container-auditor" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.170771 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-server" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.170893 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-server" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.171015 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server-init" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.171132 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server-init" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.171282 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-replicator" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.171410 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-replicator" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.171540 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-replicator" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.171660 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-replicator" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.171782 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-updater" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.171897 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" 
containerName="container-updater" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.172021 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerName="extract-utilities" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.172133 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerName="extract-utilities" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.172306 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerName="registry-server" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.172445 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerName="registry-server" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.172563 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e1f6875-8b63-4eff-b3bb-e51285f5e8b0" containerName="oc" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.172842 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e1f6875-8b63-4eff-b3bb-e51285f5e8b0" containerName="oc" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.172966 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-updater" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.173093 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-updater" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.173212 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-auditor" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.173380 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-auditor" Mar 
09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.173543 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-server" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.173664 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-server" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.173796 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerName="extract-content" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.173913 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" containerName="extract-content" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.174028 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" containerName="cinder-scheduler" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.174150 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" containerName="cinder-scheduler" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.174299 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" containerName="probe" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.174426 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" containerName="probe" Mar 09 03:08:00 crc kubenswrapper[4901]: E0309 03:08:00.174545 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovs-vswitchd" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.174662 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovs-vswitchd" Mar 09 03:08:00 crc kubenswrapper[4901]: 
I0309 03:08:00.175027 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-server" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.175152 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" containerName="cinder-scheduler" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.175308 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-updater" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.175441 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovs-vswitchd" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.175555 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-updater" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.175677 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3c3ee5-e413-4c0f-bf6b-aba2084f4e89" containerName="ovsdb-server" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.175802 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-replicator" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.175912 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-replicator" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.176168 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-server" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.176324 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="swift-recon-cron" Mar 09 
03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.176454 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-server" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.176575 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-expirer" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.176858 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="rsync" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.177001 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-replicator" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.177460 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-auditor" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.177643 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="container-auditor" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.177765 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="object-auditor" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.177885 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e1f6875-8b63-4eff-b3bb-e51285f5e8b0" containerName="oc" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.178001 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e038ec-a406-4f6b-9b8a-135c56be7514" containerName="account-reaper" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.178115 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e877c5b4-163c-4b68-b416-f6c0e7b63cc3" 
containerName="registry-server" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.178299 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5276569a-5e4d-4bbb-ab39-f9a420e4a3e9" containerName="probe" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.178836 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550428-b4t88"] Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.178994 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550428-b4t88" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.183031 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.185084 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.185447 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.259637 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x4vl\" (UniqueName: \"kubernetes.io/projected/00668526-212d-45e2-afa7-dcf90734ff5d-kube-api-access-8x4vl\") pod \"auto-csr-approver-29550428-b4t88\" (UID: \"00668526-212d-45e2-afa7-dcf90734ff5d\") " pod="openshift-infra/auto-csr-approver-29550428-b4t88" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.363447 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x4vl\" (UniqueName: \"kubernetes.io/projected/00668526-212d-45e2-afa7-dcf90734ff5d-kube-api-access-8x4vl\") pod \"auto-csr-approver-29550428-b4t88\" (UID: \"00668526-212d-45e2-afa7-dcf90734ff5d\") " pod="openshift-infra/auto-csr-approver-29550428-b4t88" Mar 09 03:08:00 crc 
kubenswrapper[4901]: I0309 03:08:00.392090 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x4vl\" (UniqueName: \"kubernetes.io/projected/00668526-212d-45e2-afa7-dcf90734ff5d-kube-api-access-8x4vl\") pod \"auto-csr-approver-29550428-b4t88\" (UID: \"00668526-212d-45e2-afa7-dcf90734ff5d\") " pod="openshift-infra/auto-csr-approver-29550428-b4t88" Mar 09 03:08:00 crc kubenswrapper[4901]: I0309 03:08:00.508989 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550428-b4t88" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:00.994702 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550428-b4t88"] Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.368334 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550428-b4t88" event={"ID":"00668526-212d-45e2-afa7-dcf90734ff5d","Type":"ContainerStarted","Data":"acb619c4a4ce8c95a46e3ffe458f7aa9e1384cb8ffe430cd66cbc098c07b09f2"} Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.426751 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k45jm"] Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.429640 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.437392 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k45jm"] Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.484505 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcrlf\" (UniqueName: \"kubernetes.io/projected/18ae8ba8-a082-4b04-8116-69139cb76eaf-kube-api-access-tcrlf\") pod \"redhat-marketplace-k45jm\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.484569 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-catalog-content\") pod \"redhat-marketplace-k45jm\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.484678 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-utilities\") pod \"redhat-marketplace-k45jm\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.586699 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcrlf\" (UniqueName: \"kubernetes.io/projected/18ae8ba8-a082-4b04-8116-69139cb76eaf-kube-api-access-tcrlf\") pod \"redhat-marketplace-k45jm\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.586782 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-catalog-content\") pod \"redhat-marketplace-k45jm\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.586870 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-utilities\") pod \"redhat-marketplace-k45jm\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.587464 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-catalog-content\") pod \"redhat-marketplace-k45jm\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.587480 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-utilities\") pod \"redhat-marketplace-k45jm\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.615587 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcrlf\" (UniqueName: \"kubernetes.io/projected/18ae8ba8-a082-4b04-8116-69139cb76eaf-kube-api-access-tcrlf\") pod \"redhat-marketplace-k45jm\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:01 crc kubenswrapper[4901]: I0309 03:08:01.757465 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:02 crc kubenswrapper[4901]: I0309 03:08:02.243306 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k45jm"] Mar 09 03:08:02 crc kubenswrapper[4901]: W0309 03:08:02.254097 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ae8ba8_a082_4b04_8116_69139cb76eaf.slice/crio-e0fa94a34d951b7d78a1509296080f221dfebd3a11695fc58f93cbb60bf909e1 WatchSource:0}: Error finding container e0fa94a34d951b7d78a1509296080f221dfebd3a11695fc58f93cbb60bf909e1: Status 404 returned error can't find the container with id e0fa94a34d951b7d78a1509296080f221dfebd3a11695fc58f93cbb60bf909e1 Mar 09 03:08:02 crc kubenswrapper[4901]: I0309 03:08:02.382983 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550428-b4t88" event={"ID":"00668526-212d-45e2-afa7-dcf90734ff5d","Type":"ContainerStarted","Data":"35562fcfde12d6f3085945b49680c207afc8b93d36c95a4e0a60217ec22bd8ed"} Mar 09 03:08:02 crc kubenswrapper[4901]: I0309 03:08:02.384686 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k45jm" event={"ID":"18ae8ba8-a082-4b04-8116-69139cb76eaf","Type":"ContainerStarted","Data":"36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e"} Mar 09 03:08:02 crc kubenswrapper[4901]: I0309 03:08:02.384754 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k45jm" event={"ID":"18ae8ba8-a082-4b04-8116-69139cb76eaf","Type":"ContainerStarted","Data":"e0fa94a34d951b7d78a1509296080f221dfebd3a11695fc58f93cbb60bf909e1"} Mar 09 03:08:02 crc kubenswrapper[4901]: I0309 03:08:02.407591 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550428-b4t88" podStartSLOduration=1.369397685 
podStartE2EDuration="2.407560992s" podCreationTimestamp="2026-03-09 03:08:00 +0000 UTC" firstStartedPulling="2026-03-09 03:08:01.004595119 +0000 UTC m=+1605.594258841" lastFinishedPulling="2026-03-09 03:08:02.042758416 +0000 UTC m=+1606.632422148" observedRunningTime="2026-03-09 03:08:02.398941044 +0000 UTC m=+1606.988604816" watchObservedRunningTime="2026-03-09 03:08:02.407560992 +0000 UTC m=+1606.997224764" Mar 09 03:08:03 crc kubenswrapper[4901]: I0309 03:08:03.399158 4901 generic.go:334] "Generic (PLEG): container finished" podID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerID="36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e" exitCode=0 Mar 09 03:08:03 crc kubenswrapper[4901]: I0309 03:08:03.399773 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k45jm" event={"ID":"18ae8ba8-a082-4b04-8116-69139cb76eaf","Type":"ContainerDied","Data":"36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e"} Mar 09 03:08:03 crc kubenswrapper[4901]: I0309 03:08:03.408814 4901 generic.go:334] "Generic (PLEG): container finished" podID="00668526-212d-45e2-afa7-dcf90734ff5d" containerID="35562fcfde12d6f3085945b49680c207afc8b93d36c95a4e0a60217ec22bd8ed" exitCode=0 Mar 09 03:08:03 crc kubenswrapper[4901]: I0309 03:08:03.408871 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550428-b4t88" event={"ID":"00668526-212d-45e2-afa7-dcf90734ff5d","Type":"ContainerDied","Data":"35562fcfde12d6f3085945b49680c207afc8b93d36c95a4e0a60217ec22bd8ed"} Mar 09 03:08:04 crc kubenswrapper[4901]: I0309 03:08:04.424734 4901 generic.go:334] "Generic (PLEG): container finished" podID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerID="8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f" exitCode=0 Mar 09 03:08:04 crc kubenswrapper[4901]: I0309 03:08:04.424807 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k45jm" 
event={"ID":"18ae8ba8-a082-4b04-8116-69139cb76eaf","Type":"ContainerDied","Data":"8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f"} Mar 09 03:08:04 crc kubenswrapper[4901]: I0309 03:08:04.855113 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550428-b4t88" Mar 09 03:08:04 crc kubenswrapper[4901]: I0309 03:08:04.943589 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x4vl\" (UniqueName: \"kubernetes.io/projected/00668526-212d-45e2-afa7-dcf90734ff5d-kube-api-access-8x4vl\") pod \"00668526-212d-45e2-afa7-dcf90734ff5d\" (UID: \"00668526-212d-45e2-afa7-dcf90734ff5d\") " Mar 09 03:08:04 crc kubenswrapper[4901]: I0309 03:08:04.956380 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00668526-212d-45e2-afa7-dcf90734ff5d-kube-api-access-8x4vl" (OuterVolumeSpecName: "kube-api-access-8x4vl") pod "00668526-212d-45e2-afa7-dcf90734ff5d" (UID: "00668526-212d-45e2-afa7-dcf90734ff5d"). InnerVolumeSpecName "kube-api-access-8x4vl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:08:05 crc kubenswrapper[4901]: I0309 03:08:05.045485 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x4vl\" (UniqueName: \"kubernetes.io/projected/00668526-212d-45e2-afa7-dcf90734ff5d-kube-api-access-8x4vl\") on node \"crc\" DevicePath \"\"" Mar 09 03:08:05 crc kubenswrapper[4901]: I0309 03:08:05.443216 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k45jm" event={"ID":"18ae8ba8-a082-4b04-8116-69139cb76eaf","Type":"ContainerStarted","Data":"cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3"} Mar 09 03:08:05 crc kubenswrapper[4901]: I0309 03:08:05.454251 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550428-b4t88" event={"ID":"00668526-212d-45e2-afa7-dcf90734ff5d","Type":"ContainerDied","Data":"acb619c4a4ce8c95a46e3ffe458f7aa9e1384cb8ffe430cd66cbc098c07b09f2"} Mar 09 03:08:05 crc kubenswrapper[4901]: I0309 03:08:05.454304 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acb619c4a4ce8c95a46e3ffe458f7aa9e1384cb8ffe430cd66cbc098c07b09f2" Mar 09 03:08:05 crc kubenswrapper[4901]: I0309 03:08:05.454407 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550428-b4t88" Mar 09 03:08:05 crc kubenswrapper[4901]: I0309 03:08:05.480786 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k45jm" podStartSLOduration=3.057267966 podStartE2EDuration="4.480757007s" podCreationTimestamp="2026-03-09 03:08:01 +0000 UTC" firstStartedPulling="2026-03-09 03:08:03.403427889 +0000 UTC m=+1607.993091641" lastFinishedPulling="2026-03-09 03:08:04.82691695 +0000 UTC m=+1609.416580682" observedRunningTime="2026-03-09 03:08:05.475689789 +0000 UTC m=+1610.065353571" watchObservedRunningTime="2026-03-09 03:08:05.480757007 +0000 UTC m=+1610.070420779" Mar 09 03:08:05 crc kubenswrapper[4901]: I0309 03:08:05.511685 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550422-wg2vp"] Mar 09 03:08:05 crc kubenswrapper[4901]: I0309 03:08:05.519832 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550422-wg2vp"] Mar 09 03:08:06 crc kubenswrapper[4901]: I0309 03:08:06.156525 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f285768d-05c2-41f7-a3db-7c76d4df9fb8" path="/var/lib/kubelet/pods/f285768d-05c2-41f7-a3db-7c76d4df9fb8/volumes" Mar 09 03:08:11 crc kubenswrapper[4901]: I0309 03:08:11.758532 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:11 crc kubenswrapper[4901]: I0309 03:08:11.759262 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:11 crc kubenswrapper[4901]: I0309 03:08:11.828498 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:12 crc kubenswrapper[4901]: I0309 03:08:12.590117 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:12 crc kubenswrapper[4901]: I0309 03:08:12.652784 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k45jm"] Mar 09 03:08:14 crc kubenswrapper[4901]: I0309 03:08:14.544631 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k45jm" podUID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerName="registry-server" containerID="cri-o://cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3" gracePeriod=2 Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.520388 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.557281 4901 generic.go:334] "Generic (PLEG): container finished" podID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerID="cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3" exitCode=0 Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.557348 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k45jm" event={"ID":"18ae8ba8-a082-4b04-8116-69139cb76eaf","Type":"ContainerDied","Data":"cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3"} Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.557387 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k45jm" event={"ID":"18ae8ba8-a082-4b04-8116-69139cb76eaf","Type":"ContainerDied","Data":"e0fa94a34d951b7d78a1509296080f221dfebd3a11695fc58f93cbb60bf909e1"} Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.557415 4901 scope.go:117] "RemoveContainer" containerID="cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.557643 4901 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k45jm" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.590259 4901 scope.go:117] "RemoveContainer" containerID="8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.608472 4901 scope.go:117] "RemoveContainer" containerID="36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.618322 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-utilities\") pod \"18ae8ba8-a082-4b04-8116-69139cb76eaf\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.618555 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcrlf\" (UniqueName: \"kubernetes.io/projected/18ae8ba8-a082-4b04-8116-69139cb76eaf-kube-api-access-tcrlf\") pod \"18ae8ba8-a082-4b04-8116-69139cb76eaf\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.618586 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-catalog-content\") pod \"18ae8ba8-a082-4b04-8116-69139cb76eaf\" (UID: \"18ae8ba8-a082-4b04-8116-69139cb76eaf\") " Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.620635 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-utilities" (OuterVolumeSpecName: "utilities") pod "18ae8ba8-a082-4b04-8116-69139cb76eaf" (UID: "18ae8ba8-a082-4b04-8116-69139cb76eaf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.626967 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ae8ba8-a082-4b04-8116-69139cb76eaf-kube-api-access-tcrlf" (OuterVolumeSpecName: "kube-api-access-tcrlf") pod "18ae8ba8-a082-4b04-8116-69139cb76eaf" (UID: "18ae8ba8-a082-4b04-8116-69139cb76eaf"). InnerVolumeSpecName "kube-api-access-tcrlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.629441 4901 scope.go:117] "RemoveContainer" containerID="cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3" Mar 09 03:08:15 crc kubenswrapper[4901]: E0309 03:08:15.629966 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3\": container with ID starting with cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3 not found: ID does not exist" containerID="cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.630057 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3"} err="failed to get container status \"cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3\": rpc error: code = NotFound desc = could not find container \"cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3\": container with ID starting with cc31d95b83c445f7f6b5b49bfb2296bd2a13f44becea13f7b580c0fa71bb20a3 not found: ID does not exist" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.630130 4901 scope.go:117] "RemoveContainer" containerID="8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f" Mar 09 03:08:15 crc kubenswrapper[4901]: E0309 03:08:15.630528 
4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f\": container with ID starting with 8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f not found: ID does not exist" containerID="8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.630604 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f"} err="failed to get container status \"8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f\": rpc error: code = NotFound desc = could not find container \"8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f\": container with ID starting with 8def7afdcba15b96e76735013b243b23fd0a4f05e8b2af98d0e42dea49fb0b8f not found: ID does not exist" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.630669 4901 scope.go:117] "RemoveContainer" containerID="36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e" Mar 09 03:08:15 crc kubenswrapper[4901]: E0309 03:08:15.631049 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e\": container with ID starting with 36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e not found: ID does not exist" containerID="36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.631126 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e"} err="failed to get container status \"36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e\": rpc error: code = 
NotFound desc = could not find container \"36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e\": container with ID starting with 36d1155bd0bff06d3842536ccf5df578962bbd77ed2194a9bb567e889069b67e not found: ID does not exist" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.650130 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18ae8ba8-a082-4b04-8116-69139cb76eaf" (UID: "18ae8ba8-a082-4b04-8116-69139cb76eaf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.721442 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcrlf\" (UniqueName: \"kubernetes.io/projected/18ae8ba8-a082-4b04-8116-69139cb76eaf-kube-api-access-tcrlf\") on node \"crc\" DevicePath \"\"" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.721539 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.721628 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ae8ba8-a082-4b04-8116-69139cb76eaf-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.924950 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k45jm"] Mar 09 03:08:15 crc kubenswrapper[4901]: I0309 03:08:15.930115 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k45jm"] Mar 09 03:08:16 crc kubenswrapper[4901]: I0309 03:08:16.124719 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="18ae8ba8-a082-4b04-8116-69139cb76eaf" path="/var/lib/kubelet/pods/18ae8ba8-a082-4b04-8116-69139cb76eaf/volumes" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.090993 4901 scope.go:117] "RemoveContainer" containerID="e864148fe534dd0649b1839ddd987f99c31ebbf9373117d1b9294d3a433bc88e" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.159463 4901 scope.go:117] "RemoveContainer" containerID="6a0b0aaad45a66c9809f0281931099b6d19002aa141c6a0eada8f3fbaa72fd76" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.222525 4901 scope.go:117] "RemoveContainer" containerID="64204cefd4cbe01f276c43b9ac47b50629c25ba8c97f8eb560808222b34c0134" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.252458 4901 scope.go:117] "RemoveContainer" containerID="c237a3dc031b13f8d0620e7b2419fa1f1ad071510e6313e26b966db949a54e0d" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.279182 4901 scope.go:117] "RemoveContainer" containerID="6994cdf17ee7689665487d835e40572e3f243ef6d5aebacb4cd6c596e686a753" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.320455 4901 scope.go:117] "RemoveContainer" containerID="6fb0dbd3461b344fda5d6ea3fedec48538094ee2e81dcd7ca011d50e8e49ef2a" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.382487 4901 scope.go:117] "RemoveContainer" containerID="9cf43734be8e6dc3ec33718d683ec770eaf32c81e43d5e3570abc6248a66f417" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.406271 4901 scope.go:117] "RemoveContainer" containerID="1318be7d0c979674383ef3ff96fcc0e4c7a31c99f6b2aff5bad67a5d745786d1" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.442808 4901 scope.go:117] "RemoveContainer" containerID="56f5155451bdd26f36bf3cf5c74566c873dff320677eab6537898737b39650c4" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.480358 4901 scope.go:117] "RemoveContainer" containerID="1d53ed824e5bcad8a60829f5bcbcafb26fcda34318175adc46a6fe47e190e181" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.506030 4901 scope.go:117] 
"RemoveContainer" containerID="dc3a599575b60a21c71f623e29f5f5b5582a07c90c9238d7f5e671558262e0e8" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.536279 4901 scope.go:117] "RemoveContainer" containerID="ea76e472f757b16bd652388c5e4764ba583b8796f84a9249bfa120a81030e5e8" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.556970 4901 scope.go:117] "RemoveContainer" containerID="0bbad5de2a9568ae31ba845734a4840ec69ea84dc62b4ce66b1d97a4dc0ceae5" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.601569 4901 scope.go:117] "RemoveContainer" containerID="02af839290485dbd8e363b21c0ec495ab229698aebd9a2373b229156c393f2f6" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.644885 4901 scope.go:117] "RemoveContainer" containerID="b061f4aaf549eb86d1d7efe612297014d2cec92cbdf9b8e77e24149dce0551ac" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.675246 4901 scope.go:117] "RemoveContainer" containerID="ec8bd32114b4f887e1e946b40b5745464a56078a3296b072c0682cc38da9fb6e" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.726136 4901 scope.go:117] "RemoveContainer" containerID="a252bc804dc69a7818b1fc5403d3bbbeeb8a07502d697a31a00f9fffeffc0dc3" Mar 09 03:08:36 crc kubenswrapper[4901]: I0309 03:08:36.753982 4901 scope.go:117] "RemoveContainer" containerID="ca232f3af6edff5f71fbe5a88099baa97cdd016953afd10711e2cf0859f5d72e" Mar 09 03:09:37 crc kubenswrapper[4901]: I0309 03:09:37.175246 4901 scope.go:117] "RemoveContainer" containerID="d15cb14ee1f318a86895da38e06143089829e2c3813d3c07043aacb028b2aada" Mar 09 03:09:37 crc kubenswrapper[4901]: I0309 03:09:37.213376 4901 scope.go:117] "RemoveContainer" containerID="cfc47f8beb8ecefcfac0787d59cd1dc8d3416566c1269b0f491dd5ea14608964" Mar 09 03:09:37 crc kubenswrapper[4901]: I0309 03:09:37.239841 4901 scope.go:117] "RemoveContainer" containerID="e1228ba42fc90c0cde6de4a3ef427e708441ad24f194f71b8f084075c75abf93" Mar 09 03:09:37 crc kubenswrapper[4901]: I0309 03:09:37.266079 4901 scope.go:117] "RemoveContainer" 
containerID="f20698a7666281d2b7b17ad3948cf16cb7044c99d7b7d9bd908fc1bc3eea7a32" Mar 09 03:09:37 crc kubenswrapper[4901]: I0309 03:09:37.301097 4901 scope.go:117] "RemoveContainer" containerID="d79cab62bb0112d2d186528ed06fc210871e18ac819e1d659adeeda8aaae0c07" Mar 09 03:09:37 crc kubenswrapper[4901]: I0309 03:09:37.331132 4901 scope.go:117] "RemoveContainer" containerID="ff6b196fb187bea817fb3de6278431b1813fe38037e7088e05efb7c067276b2c" Mar 09 03:09:37 crc kubenswrapper[4901]: I0309 03:09:37.356082 4901 scope.go:117] "RemoveContainer" containerID="bc2e0443e15be5cd6de9ba14dfcdf841d6b447a932b62328a61c9eff56f7aba6" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.169613 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550430-t9p6m"] Mar 09 03:10:00 crc kubenswrapper[4901]: E0309 03:10:00.171106 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerName="registry-server" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.171137 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerName="registry-server" Mar 09 03:10:00 crc kubenswrapper[4901]: E0309 03:10:00.171176 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerName="extract-utilities" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.171196 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerName="extract-utilities" Mar 09 03:10:00 crc kubenswrapper[4901]: E0309 03:10:00.171326 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerName="extract-content" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.171351 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerName="extract-content" Mar 09 03:10:00 
crc kubenswrapper[4901]: E0309 03:10:00.171390 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00668526-212d-45e2-afa7-dcf90734ff5d" containerName="oc" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.171408 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="00668526-212d-45e2-afa7-dcf90734ff5d" containerName="oc" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.171793 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="00668526-212d-45e2-afa7-dcf90734ff5d" containerName="oc" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.171858 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ae8ba8-a082-4b04-8116-69139cb76eaf" containerName="registry-server" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.172886 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550430-t9p6m" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.175998 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.178728 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.183466 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.185605 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550430-t9p6m"] Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.216413 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwbd\" (UniqueName: \"kubernetes.io/projected/5251665d-9f46-4d5a-8c20-0962766e0f84-kube-api-access-crwbd\") pod \"auto-csr-approver-29550430-t9p6m\" (UID: 
\"5251665d-9f46-4d5a-8c20-0962766e0f84\") " pod="openshift-infra/auto-csr-approver-29550430-t9p6m" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.318478 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crwbd\" (UniqueName: \"kubernetes.io/projected/5251665d-9f46-4d5a-8c20-0962766e0f84-kube-api-access-crwbd\") pod \"auto-csr-approver-29550430-t9p6m\" (UID: \"5251665d-9f46-4d5a-8c20-0962766e0f84\") " pod="openshift-infra/auto-csr-approver-29550430-t9p6m" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.342191 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwbd\" (UniqueName: \"kubernetes.io/projected/5251665d-9f46-4d5a-8c20-0962766e0f84-kube-api-access-crwbd\") pod \"auto-csr-approver-29550430-t9p6m\" (UID: \"5251665d-9f46-4d5a-8c20-0962766e0f84\") " pod="openshift-infra/auto-csr-approver-29550430-t9p6m" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.507281 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550430-t9p6m" Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.862900 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:10:00 crc kubenswrapper[4901]: I0309 03:10:00.862961 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:10:01 crc kubenswrapper[4901]: I0309 03:10:01.034187 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550430-t9p6m"] Mar 09 03:10:01 crc kubenswrapper[4901]: I0309 03:10:01.748673 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550430-t9p6m" event={"ID":"5251665d-9f46-4d5a-8c20-0962766e0f84","Type":"ContainerStarted","Data":"b74a3eb7ccc050775fa232de371dcd102f0946d8ef4f5d5d531471652ea8396b"} Mar 09 03:10:02 crc kubenswrapper[4901]: I0309 03:10:02.759856 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550430-t9p6m" event={"ID":"5251665d-9f46-4d5a-8c20-0962766e0f84","Type":"ContainerStarted","Data":"09331b7dd3de1dc5e49fc358cbe46ab76e0d17711be3c5ac553b513458f833ab"} Mar 09 03:10:02 crc kubenswrapper[4901]: I0309 03:10:02.785702 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550430-t9p6m" podStartSLOduration=1.584213456 podStartE2EDuration="2.785676902s" podCreationTimestamp="2026-03-09 03:10:00 +0000 UTC" firstStartedPulling="2026-03-09 
03:10:01.04587327 +0000 UTC m=+1725.635537042" lastFinishedPulling="2026-03-09 03:10:02.247336716 +0000 UTC m=+1726.837000488" observedRunningTime="2026-03-09 03:10:02.781546937 +0000 UTC m=+1727.371210709" watchObservedRunningTime="2026-03-09 03:10:02.785676902 +0000 UTC m=+1727.375340644" Mar 09 03:10:03 crc kubenswrapper[4901]: I0309 03:10:03.773642 4901 generic.go:334] "Generic (PLEG): container finished" podID="5251665d-9f46-4d5a-8c20-0962766e0f84" containerID="09331b7dd3de1dc5e49fc358cbe46ab76e0d17711be3c5ac553b513458f833ab" exitCode=0 Mar 09 03:10:03 crc kubenswrapper[4901]: I0309 03:10:03.773703 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550430-t9p6m" event={"ID":"5251665d-9f46-4d5a-8c20-0962766e0f84","Type":"ContainerDied","Data":"09331b7dd3de1dc5e49fc358cbe46ab76e0d17711be3c5ac553b513458f833ab"} Mar 09 03:10:05 crc kubenswrapper[4901]: I0309 03:10:05.172981 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550430-t9p6m" Mar 09 03:10:05 crc kubenswrapper[4901]: I0309 03:10:05.208889 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crwbd\" (UniqueName: \"kubernetes.io/projected/5251665d-9f46-4d5a-8c20-0962766e0f84-kube-api-access-crwbd\") pod \"5251665d-9f46-4d5a-8c20-0962766e0f84\" (UID: \"5251665d-9f46-4d5a-8c20-0962766e0f84\") " Mar 09 03:10:05 crc kubenswrapper[4901]: I0309 03:10:05.219603 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5251665d-9f46-4d5a-8c20-0962766e0f84-kube-api-access-crwbd" (OuterVolumeSpecName: "kube-api-access-crwbd") pod "5251665d-9f46-4d5a-8c20-0962766e0f84" (UID: "5251665d-9f46-4d5a-8c20-0962766e0f84"). InnerVolumeSpecName "kube-api-access-crwbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:10:05 crc kubenswrapper[4901]: I0309 03:10:05.310914 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crwbd\" (UniqueName: \"kubernetes.io/projected/5251665d-9f46-4d5a-8c20-0962766e0f84-kube-api-access-crwbd\") on node \"crc\" DevicePath \"\"" Mar 09 03:10:05 crc kubenswrapper[4901]: I0309 03:10:05.801545 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550430-t9p6m" event={"ID":"5251665d-9f46-4d5a-8c20-0962766e0f84","Type":"ContainerDied","Data":"b74a3eb7ccc050775fa232de371dcd102f0946d8ef4f5d5d531471652ea8396b"} Mar 09 03:10:05 crc kubenswrapper[4901]: I0309 03:10:05.802008 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b74a3eb7ccc050775fa232de371dcd102f0946d8ef4f5d5d531471652ea8396b" Mar 09 03:10:05 crc kubenswrapper[4901]: I0309 03:10:05.801678 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550430-t9p6m" Mar 09 03:10:05 crc kubenswrapper[4901]: I0309 03:10:05.877261 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550424-rt4jl"] Mar 09 03:10:05 crc kubenswrapper[4901]: I0309 03:10:05.885379 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550424-rt4jl"] Mar 09 03:10:06 crc kubenswrapper[4901]: E0309 03:10:06.005611 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5251665d_9f46_4d5a_8c20_0962766e0f84.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5251665d_9f46_4d5a_8c20_0962766e0f84.slice/crio-b74a3eb7ccc050775fa232de371dcd102f0946d8ef4f5d5d531471652ea8396b\": RecentStats: unable to find data in memory cache]" 
Mar 09 03:10:06 crc kubenswrapper[4901]: I0309 03:10:06.125091 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="725ee9e1-68e0-4756-bfc7-e4d209aeeae8" path="/var/lib/kubelet/pods/725ee9e1-68e0-4756-bfc7-e4d209aeeae8/volumes" Mar 09 03:10:30 crc kubenswrapper[4901]: I0309 03:10:30.862674 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:10:30 crc kubenswrapper[4901]: I0309 03:10:30.863314 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:10:37 crc kubenswrapper[4901]: I0309 03:10:37.518037 4901 scope.go:117] "RemoveContainer" containerID="ff692325b2a193aa8217a1042192281dc1b97f1fb55180f2002c7c09b3a9f731" Mar 09 03:10:37 crc kubenswrapper[4901]: I0309 03:10:37.554408 4901 scope.go:117] "RemoveContainer" containerID="7eb050b8062d25a6a16cf0cb4f42df92e684eebb712dff6d0628185b0a853346" Mar 09 03:10:37 crc kubenswrapper[4901]: I0309 03:10:37.623847 4901 scope.go:117] "RemoveContainer" containerID="2023e3fb3e8f7021afadc49905f2c75c9b97db5b0d3ac7172346d27ce1b5d2f7" Mar 09 03:10:37 crc kubenswrapper[4901]: I0309 03:10:37.651074 4901 scope.go:117] "RemoveContainer" containerID="5d9e5ccec85a35a04d0a76fa028d5ad72a893c23a40e4f14f4d92ce0a8b5962c" Mar 09 03:10:37 crc kubenswrapper[4901]: I0309 03:10:37.708974 4901 scope.go:117] "RemoveContainer" containerID="7323f3ec790c22a5c15492d0b747ce5e981ed6529f8b38178d47eb18686cb80a" Mar 09 03:11:00 crc kubenswrapper[4901]: I0309 03:11:00.862856 4901 patch_prober.go:28] interesting 
pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:11:00 crc kubenswrapper[4901]: I0309 03:11:00.863661 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:11:00 crc kubenswrapper[4901]: I0309 03:11:00.863736 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 03:11:00 crc kubenswrapper[4901]: I0309 03:11:00.864591 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 03:11:00 crc kubenswrapper[4901]: I0309 03:11:00.864730 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" gracePeriod=600 Mar 09 03:11:00 crc kubenswrapper[4901]: E0309 03:11:00.991594 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:11:01 crc kubenswrapper[4901]: I0309 03:11:01.323549 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" exitCode=0 Mar 09 03:11:01 crc kubenswrapper[4901]: I0309 03:11:01.323611 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2"} Mar 09 03:11:01 crc kubenswrapper[4901]: I0309 03:11:01.323664 4901 scope.go:117] "RemoveContainer" containerID="e3004d260bc17a7df9a3f09f9c3fb88b56d94af0a91dbe7f057c714451b1f515" Mar 09 03:11:01 crc kubenswrapper[4901]: I0309 03:11:01.324485 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:11:01 crc kubenswrapper[4901]: E0309 03:11:01.325364 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:11:15 crc kubenswrapper[4901]: I0309 03:11:15.106043 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:11:15 crc kubenswrapper[4901]: E0309 03:11:15.107141 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:11:30 crc kubenswrapper[4901]: I0309 03:11:30.106804 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:11:30 crc kubenswrapper[4901]: E0309 03:11:30.107843 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:11:37 crc kubenswrapper[4901]: I0309 03:11:37.834099 4901 scope.go:117] "RemoveContainer" containerID="fe08a232bade973666189357aa00dd8de0649590396e21fa9caeeccc1270a3ef" Mar 09 03:11:43 crc kubenswrapper[4901]: I0309 03:11:43.106835 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:11:43 crc kubenswrapper[4901]: E0309 03:11:43.107936 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:11:58 crc kubenswrapper[4901]: I0309 03:11:58.106651 4901 scope.go:117] "RemoveContainer" 
containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:11:58 crc kubenswrapper[4901]: E0309 03:11:58.107742 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.168115 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550432-rb2kd"] Mar 09 03:12:00 crc kubenswrapper[4901]: E0309 03:12:00.169034 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5251665d-9f46-4d5a-8c20-0962766e0f84" containerName="oc" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.169061 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5251665d-9f46-4d5a-8c20-0962766e0f84" containerName="oc" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.169360 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5251665d-9f46-4d5a-8c20-0962766e0f84" containerName="oc" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.170355 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550432-rb2kd" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.173510 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.173569 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.174307 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.184844 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550432-rb2kd"] Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.300538 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx4m2\" (UniqueName: \"kubernetes.io/projected/594dbe7d-afcd-4710-be6a-631e834c2614-kube-api-access-cx4m2\") pod \"auto-csr-approver-29550432-rb2kd\" (UID: \"594dbe7d-afcd-4710-be6a-631e834c2614\") " pod="openshift-infra/auto-csr-approver-29550432-rb2kd" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.402723 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx4m2\" (UniqueName: \"kubernetes.io/projected/594dbe7d-afcd-4710-be6a-631e834c2614-kube-api-access-cx4m2\") pod \"auto-csr-approver-29550432-rb2kd\" (UID: \"594dbe7d-afcd-4710-be6a-631e834c2614\") " pod="openshift-infra/auto-csr-approver-29550432-rb2kd" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.437013 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx4m2\" (UniqueName: \"kubernetes.io/projected/594dbe7d-afcd-4710-be6a-631e834c2614-kube-api-access-cx4m2\") pod \"auto-csr-approver-29550432-rb2kd\" (UID: \"594dbe7d-afcd-4710-be6a-631e834c2614\") " 
pod="openshift-infra/auto-csr-approver-29550432-rb2kd" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.505944 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550432-rb2kd" Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.815417 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550432-rb2kd"] Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.823834 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 03:12:00 crc kubenswrapper[4901]: I0309 03:12:00.878424 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550432-rb2kd" event={"ID":"594dbe7d-afcd-4710-be6a-631e834c2614","Type":"ContainerStarted","Data":"502a9c1da91b81c13fa6dfbd2a7710c2deafc9c9a0aca8ce6cac5b7fc119bb0b"} Mar 09 03:12:02 crc kubenswrapper[4901]: I0309 03:12:02.904066 4901 generic.go:334] "Generic (PLEG): container finished" podID="594dbe7d-afcd-4710-be6a-631e834c2614" containerID="97870cd09e2cc5eb34a06e20679ab245e08203e1d8392052533ec378e5319d9a" exitCode=0 Mar 09 03:12:02 crc kubenswrapper[4901]: I0309 03:12:02.904192 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550432-rb2kd" event={"ID":"594dbe7d-afcd-4710-be6a-631e834c2614","Type":"ContainerDied","Data":"97870cd09e2cc5eb34a06e20679ab245e08203e1d8392052533ec378e5319d9a"} Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.322719 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550432-rb2kd" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.473055 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx4m2\" (UniqueName: \"kubernetes.io/projected/594dbe7d-afcd-4710-be6a-631e834c2614-kube-api-access-cx4m2\") pod \"594dbe7d-afcd-4710-be6a-631e834c2614\" (UID: \"594dbe7d-afcd-4710-be6a-631e834c2614\") " Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.481012 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594dbe7d-afcd-4710-be6a-631e834c2614-kube-api-access-cx4m2" (OuterVolumeSpecName: "kube-api-access-cx4m2") pod "594dbe7d-afcd-4710-be6a-631e834c2614" (UID: "594dbe7d-afcd-4710-be6a-631e834c2614"). InnerVolumeSpecName "kube-api-access-cx4m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.574662 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx4m2\" (UniqueName: \"kubernetes.io/projected/594dbe7d-afcd-4710-be6a-631e834c2614-kube-api-access-cx4m2\") on node \"crc\" DevicePath \"\"" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.785301 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-knmtq"] Mar 09 03:12:04 crc kubenswrapper[4901]: E0309 03:12:04.785773 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594dbe7d-afcd-4710-be6a-631e834c2614" containerName="oc" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.785801 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="594dbe7d-afcd-4710-be6a-631e834c2614" containerName="oc" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.786157 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="594dbe7d-afcd-4710-be6a-631e834c2614" containerName="oc" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.787940 4901 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.797493 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knmtq"] Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.878758 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-utilities\") pod \"community-operators-knmtq\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.878852 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzkq2\" (UniqueName: \"kubernetes.io/projected/192cb251-df07-4653-b054-1e606fe20f31-kube-api-access-fzkq2\") pod \"community-operators-knmtq\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.878915 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-catalog-content\") pod \"community-operators-knmtq\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.921207 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550432-rb2kd" event={"ID":"594dbe7d-afcd-4710-be6a-631e834c2614","Type":"ContainerDied","Data":"502a9c1da91b81c13fa6dfbd2a7710c2deafc9c9a0aca8ce6cac5b7fc119bb0b"} Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.921277 4901 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="502a9c1da91b81c13fa6dfbd2a7710c2deafc9c9a0aca8ce6cac5b7fc119bb0b" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.921303 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550432-rb2kd" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.977429 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f2h5l"] Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.979105 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.979904 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-catalog-content\") pod \"community-operators-knmtq\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.979972 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-utilities\") pod \"community-operators-knmtq\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.980121 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzkq2\" (UniqueName: \"kubernetes.io/projected/192cb251-df07-4653-b054-1e606fe20f31-kube-api-access-fzkq2\") pod \"community-operators-knmtq\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.980703 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-catalog-content\") pod \"community-operators-knmtq\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.980777 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-utilities\") pod \"community-operators-knmtq\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:04 crc kubenswrapper[4901]: I0309 03:12:04.997258 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f2h5l"] Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.009979 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzkq2\" (UniqueName: \"kubernetes.io/projected/192cb251-df07-4653-b054-1e606fe20f31-kube-api-access-fzkq2\") pod \"community-operators-knmtq\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.081996 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-utilities\") pod \"certified-operators-f2h5l\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.082080 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-catalog-content\") pod \"certified-operators-f2h5l\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " 
pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.082116 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkt6r\" (UniqueName: \"kubernetes.io/projected/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-kube-api-access-tkt6r\") pod \"certified-operators-f2h5l\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.128908 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.183504 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkt6r\" (UniqueName: \"kubernetes.io/projected/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-kube-api-access-tkt6r\") pod \"certified-operators-f2h5l\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.184538 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-utilities\") pod \"certified-operators-f2h5l\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.184703 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-catalog-content\") pod \"certified-operators-f2h5l\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.185461 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-catalog-content\") pod \"certified-operators-f2h5l\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.188952 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-utilities\") pod \"certified-operators-f2h5l\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.201461 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkt6r\" (UniqueName: \"kubernetes.io/projected/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-kube-api-access-tkt6r\") pod \"certified-operators-f2h5l\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.299552 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.433886 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550426-wkjmq"] Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.439289 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550426-wkjmq"] Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.621103 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f2h5l"] Mar 09 03:12:05 crc kubenswrapper[4901]: W0309 03:12:05.628437 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0c4a0f9_7822_43a9_ab69_f42a567fb9cd.slice/crio-bf787350ba407eb7629d92cf6482d5dc906e27f9ef98f19eaef6252cdb666397 WatchSource:0}: Error finding container bf787350ba407eb7629d92cf6482d5dc906e27f9ef98f19eaef6252cdb666397: Status 404 returned error can't find the container with id bf787350ba407eb7629d92cf6482d5dc906e27f9ef98f19eaef6252cdb666397 Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.671037 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knmtq"] Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.929141 4901 generic.go:334] "Generic (PLEG): container finished" podID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerID="7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951" exitCode=0 Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.929253 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2h5l" event={"ID":"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd","Type":"ContainerDied","Data":"7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951"} Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.929283 4901 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-f2h5l" event={"ID":"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd","Type":"ContainerStarted","Data":"bf787350ba407eb7629d92cf6482d5dc906e27f9ef98f19eaef6252cdb666397"} Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.931082 4901 generic.go:334] "Generic (PLEG): container finished" podID="192cb251-df07-4653-b054-1e606fe20f31" containerID="149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309" exitCode=0 Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.931128 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knmtq" event={"ID":"192cb251-df07-4653-b054-1e606fe20f31","Type":"ContainerDied","Data":"149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309"} Mar 09 03:12:05 crc kubenswrapper[4901]: I0309 03:12:05.931157 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knmtq" event={"ID":"192cb251-df07-4653-b054-1e606fe20f31","Type":"ContainerStarted","Data":"2c2424f6148f0ffbb09d6ccdae3b6cd916595aeee019305d0473c2853df2f7fd"} Mar 09 03:12:06 crc kubenswrapper[4901]: I0309 03:12:06.114019 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e1f6875-8b63-4eff-b3bb-e51285f5e8b0" path="/var/lib/kubelet/pods/9e1f6875-8b63-4eff-b3bb-e51285f5e8b0/volumes" Mar 09 03:12:06 crc kubenswrapper[4901]: I0309 03:12:06.941807 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2h5l" event={"ID":"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd","Type":"ContainerStarted","Data":"9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59"} Mar 09 03:12:07 crc kubenswrapper[4901]: I0309 03:12:07.954145 4901 generic.go:334] "Generic (PLEG): container finished" podID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerID="9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59" exitCode=0 Mar 09 03:12:07 crc kubenswrapper[4901]: I0309 
03:12:07.954835 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2h5l" event={"ID":"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd","Type":"ContainerDied","Data":"9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59"} Mar 09 03:12:07 crc kubenswrapper[4901]: I0309 03:12:07.957665 4901 generic.go:334] "Generic (PLEG): container finished" podID="192cb251-df07-4653-b054-1e606fe20f31" containerID="9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e" exitCode=0 Mar 09 03:12:07 crc kubenswrapper[4901]: I0309 03:12:07.957729 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knmtq" event={"ID":"192cb251-df07-4653-b054-1e606fe20f31","Type":"ContainerDied","Data":"9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e"} Mar 09 03:12:08 crc kubenswrapper[4901]: I0309 03:12:08.973309 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2h5l" event={"ID":"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd","Type":"ContainerStarted","Data":"d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e"} Mar 09 03:12:08 crc kubenswrapper[4901]: I0309 03:12:08.977408 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knmtq" event={"ID":"192cb251-df07-4653-b054-1e606fe20f31","Type":"ContainerStarted","Data":"daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299"} Mar 09 03:12:08 crc kubenswrapper[4901]: I0309 03:12:08.999627 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f2h5l" podStartSLOduration=2.5178678100000003 podStartE2EDuration="4.999610016s" podCreationTimestamp="2026-03-09 03:12:04 +0000 UTC" firstStartedPulling="2026-03-09 03:12:05.931455719 +0000 UTC m=+1850.521119451" lastFinishedPulling="2026-03-09 03:12:08.413197915 +0000 UTC m=+1853.002861657" 
observedRunningTime="2026-03-09 03:12:08.996635651 +0000 UTC m=+1853.586299453" watchObservedRunningTime="2026-03-09 03:12:08.999610016 +0000 UTC m=+1853.589273748" Mar 09 03:12:09 crc kubenswrapper[4901]: I0309 03:12:09.023541 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-knmtq" podStartSLOduration=2.512285479 podStartE2EDuration="5.023513511s" podCreationTimestamp="2026-03-09 03:12:04 +0000 UTC" firstStartedPulling="2026-03-09 03:12:05.932384352 +0000 UTC m=+1850.522048084" lastFinishedPulling="2026-03-09 03:12:08.443612344 +0000 UTC m=+1853.033276116" observedRunningTime="2026-03-09 03:12:09.018177766 +0000 UTC m=+1853.607841498" watchObservedRunningTime="2026-03-09 03:12:09.023513511 +0000 UTC m=+1853.613177303" Mar 09 03:12:09 crc kubenswrapper[4901]: I0309 03:12:09.105903 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:12:09 crc kubenswrapper[4901]: E0309 03:12:09.106192 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:12:15 crc kubenswrapper[4901]: I0309 03:12:15.129367 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:15 crc kubenswrapper[4901]: I0309 03:12:15.130070 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:15 crc kubenswrapper[4901]: I0309 03:12:15.204058 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:15 crc kubenswrapper[4901]: I0309 03:12:15.300985 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:15 crc kubenswrapper[4901]: I0309 03:12:15.301045 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:15 crc kubenswrapper[4901]: I0309 03:12:15.372262 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:16 crc kubenswrapper[4901]: I0309 03:12:16.125130 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:16 crc kubenswrapper[4901]: I0309 03:12:16.125196 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:16 crc kubenswrapper[4901]: I0309 03:12:16.653671 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f2h5l"] Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.062727 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f2h5l" podUID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerName="registry-server" containerID="cri-o://d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e" gracePeriod=2 Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.457731 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knmtq"] Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.458515 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-knmtq" podUID="192cb251-df07-4653-b054-1e606fe20f31" containerName="registry-server" 
containerID="cri-o://daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299" gracePeriod=2 Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.635494 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.801881 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkt6r\" (UniqueName: \"kubernetes.io/projected/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-kube-api-access-tkt6r\") pod \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.802454 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-catalog-content\") pod \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.802615 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-utilities\") pod \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\" (UID: \"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd\") " Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.803485 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-utilities" (OuterVolumeSpecName: "utilities") pod "e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" (UID: "e0c4a0f9-7822-43a9-ab69-f42a567fb9cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.812368 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-kube-api-access-tkt6r" (OuterVolumeSpecName: "kube-api-access-tkt6r") pod "e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" (UID: "e0c4a0f9-7822-43a9-ab69-f42a567fb9cd"). InnerVolumeSpecName "kube-api-access-tkt6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.858427 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.885430 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" (UID: "e0c4a0f9-7822-43a9-ab69-f42a567fb9cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.905215 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.905311 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkt6r\" (UniqueName: \"kubernetes.io/projected/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-kube-api-access-tkt6r\") on node \"crc\" DevicePath \"\"" Mar 09 03:12:18 crc kubenswrapper[4901]: I0309 03:12:18.905338 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.006732 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-catalog-content\") pod \"192cb251-df07-4653-b054-1e606fe20f31\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.006833 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-utilities\") pod \"192cb251-df07-4653-b054-1e606fe20f31\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.006984 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzkq2\" (UniqueName: \"kubernetes.io/projected/192cb251-df07-4653-b054-1e606fe20f31-kube-api-access-fzkq2\") pod \"192cb251-df07-4653-b054-1e606fe20f31\" (UID: \"192cb251-df07-4653-b054-1e606fe20f31\") " Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.010209 
4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-utilities" (OuterVolumeSpecName: "utilities") pod "192cb251-df07-4653-b054-1e606fe20f31" (UID: "192cb251-df07-4653-b054-1e606fe20f31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.012811 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192cb251-df07-4653-b054-1e606fe20f31-kube-api-access-fzkq2" (OuterVolumeSpecName: "kube-api-access-fzkq2") pod "192cb251-df07-4653-b054-1e606fe20f31" (UID: "192cb251-df07-4653-b054-1e606fe20f31"). InnerVolumeSpecName "kube-api-access-fzkq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.076886 4901 generic.go:334] "Generic (PLEG): container finished" podID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerID="d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e" exitCode=0 Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.076938 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2h5l" event={"ID":"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd","Type":"ContainerDied","Data":"d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e"} Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.076960 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f2h5l" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.077001 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2h5l" event={"ID":"e0c4a0f9-7822-43a9-ab69-f42a567fb9cd","Type":"ContainerDied","Data":"bf787350ba407eb7629d92cf6482d5dc906e27f9ef98f19eaef6252cdb666397"} Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.077033 4901 scope.go:117] "RemoveContainer" containerID="d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.082539 4901 generic.go:334] "Generic (PLEG): container finished" podID="192cb251-df07-4653-b054-1e606fe20f31" containerID="daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299" exitCode=0 Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.082605 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knmtq" event={"ID":"192cb251-df07-4653-b054-1e606fe20f31","Type":"ContainerDied","Data":"daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299"} Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.082617 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knmtq" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.082647 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knmtq" event={"ID":"192cb251-df07-4653-b054-1e606fe20f31","Type":"ContainerDied","Data":"2c2424f6148f0ffbb09d6ccdae3b6cd916595aeee019305d0473c2853df2f7fd"} Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.097917 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "192cb251-df07-4653-b054-1e606fe20f31" (UID: "192cb251-df07-4653-b054-1e606fe20f31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.099977 4901 scope.go:117] "RemoveContainer" containerID="9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.109477 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.109506 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzkq2\" (UniqueName: \"kubernetes.io/projected/192cb251-df07-4653-b054-1e606fe20f31-kube-api-access-fzkq2\") on node \"crc\" DevicePath \"\"" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.109517 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192cb251-df07-4653-b054-1e606fe20f31-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.131693 4901 scope.go:117] "RemoveContainer" 
containerID="7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.133860 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f2h5l"] Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.146060 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f2h5l"] Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.152769 4901 scope.go:117] "RemoveContainer" containerID="d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e" Mar 09 03:12:19 crc kubenswrapper[4901]: E0309 03:12:19.153708 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e\": container with ID starting with d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e not found: ID does not exist" containerID="d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.153775 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e"} err="failed to get container status \"d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e\": rpc error: code = NotFound desc = could not find container \"d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e\": container with ID starting with d03a8d03084f70404ffc1278f8263c851e021c67c7d22ae37e6d29cc16da011e not found: ID does not exist" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.153827 4901 scope.go:117] "RemoveContainer" containerID="9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59" Mar 09 03:12:19 crc kubenswrapper[4901]: E0309 03:12:19.154318 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59\": container with ID starting with 9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59 not found: ID does not exist" containerID="9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.154381 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59"} err="failed to get container status \"9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59\": rpc error: code = NotFound desc = could not find container \"9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59\": container with ID starting with 9966d7b9e3abec4abb6dcbddb040811c1f73258b2364f28a141ffbf9d5667e59 not found: ID does not exist" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.154424 4901 scope.go:117] "RemoveContainer" containerID="7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951" Mar 09 03:12:19 crc kubenswrapper[4901]: E0309 03:12:19.154800 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951\": container with ID starting with 7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951 not found: ID does not exist" containerID="7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.154851 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951"} err="failed to get container status \"7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951\": rpc error: code = NotFound desc = could not find container 
\"7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951\": container with ID starting with 7aefbf7d59031c9b6447aefc7366e4a8b281fc5fb507da6a96f2e9312c7f3951 not found: ID does not exist" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.154882 4901 scope.go:117] "RemoveContainer" containerID="daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.174320 4901 scope.go:117] "RemoveContainer" containerID="9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.197542 4901 scope.go:117] "RemoveContainer" containerID="149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.253738 4901 scope.go:117] "RemoveContainer" containerID="daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299" Mar 09 03:12:19 crc kubenswrapper[4901]: E0309 03:12:19.254375 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299\": container with ID starting with daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299 not found: ID does not exist" containerID="daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.254441 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299"} err="failed to get container status \"daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299\": rpc error: code = NotFound desc = could not find container \"daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299\": container with ID starting with daaf5a31a069a5d06d0916b4f89526c66fdd1184f4bfe0d786f51cce75584299 not found: ID does not exist" Mar 09 03:12:19 crc 
kubenswrapper[4901]: I0309 03:12:19.254485 4901 scope.go:117] "RemoveContainer" containerID="9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e" Mar 09 03:12:19 crc kubenswrapper[4901]: E0309 03:12:19.255090 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e\": container with ID starting with 9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e not found: ID does not exist" containerID="9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.255156 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e"} err="failed to get container status \"9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e\": rpc error: code = NotFound desc = could not find container \"9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e\": container with ID starting with 9c24b17a4fd3de2bb8cfff37dfdd07eaf62919b1e2f4ae3b523aca66cc96d17e not found: ID does not exist" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.255201 4901 scope.go:117] "RemoveContainer" containerID="149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309" Mar 09 03:12:19 crc kubenswrapper[4901]: E0309 03:12:19.255655 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309\": container with ID starting with 149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309 not found: ID does not exist" containerID="149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.255843 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309"} err="failed to get container status \"149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309\": rpc error: code = NotFound desc = could not find container \"149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309\": container with ID starting with 149cc5990f82eade8153601ab841199c3bc52688940c92a987de23a53e257309 not found: ID does not exist" Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.439513 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knmtq"] Mar 09 03:12:19 crc kubenswrapper[4901]: I0309 03:12:19.451355 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-knmtq"] Mar 09 03:12:20 crc kubenswrapper[4901]: I0309 03:12:20.122537 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="192cb251-df07-4653-b054-1e606fe20f31" path="/var/lib/kubelet/pods/192cb251-df07-4653-b054-1e606fe20f31/volumes" Mar 09 03:12:20 crc kubenswrapper[4901]: I0309 03:12:20.124293 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" path="/var/lib/kubelet/pods/e0c4a0f9-7822-43a9-ab69-f42a567fb9cd/volumes" Mar 09 03:12:23 crc kubenswrapper[4901]: I0309 03:12:23.106768 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:12:23 crc kubenswrapper[4901]: E0309 03:12:23.107705 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:12:37 
crc kubenswrapper[4901]: I0309 03:12:37.940475 4901 scope.go:117] "RemoveContainer" containerID="716ee6debd7ff6a787ab13690b030042fb8f2f2451f2acd85f7cbb0616b3196e" Mar 09 03:12:38 crc kubenswrapper[4901]: I0309 03:12:38.106215 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:12:38 crc kubenswrapper[4901]: E0309 03:12:38.106754 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:12:53 crc kubenswrapper[4901]: I0309 03:12:53.107336 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:12:53 crc kubenswrapper[4901]: E0309 03:12:53.108048 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:13:06 crc kubenswrapper[4901]: I0309 03:13:06.114434 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:13:06 crc kubenswrapper[4901]: E0309 03:13:06.115834 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:13:19 crc kubenswrapper[4901]: I0309 03:13:19.107151 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:13:19 crc kubenswrapper[4901]: E0309 03:13:19.108817 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:13:32 crc kubenswrapper[4901]: I0309 03:13:32.106966 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:13:32 crc kubenswrapper[4901]: E0309 03:13:32.107715 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:13:45 crc kubenswrapper[4901]: I0309 03:13:45.107598 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:13:45 crc kubenswrapper[4901]: E0309 03:13:45.108885 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:13:57 crc kubenswrapper[4901]: I0309 03:13:57.106632 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:13:57 crc kubenswrapper[4901]: E0309 03:13:57.107612 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.159316 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550434-x4s5z"] Mar 09 03:14:00 crc kubenswrapper[4901]: E0309 03:14:00.159661 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerName="extract-utilities" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.159674 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerName="extract-utilities" Mar 09 03:14:00 crc kubenswrapper[4901]: E0309 03:14:00.159696 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerName="extract-content" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.159704 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerName="extract-content" Mar 09 03:14:00 crc kubenswrapper[4901]: E0309 03:14:00.159716 4901 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="192cb251-df07-4653-b054-1e606fe20f31" containerName="extract-content" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.159725 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="192cb251-df07-4653-b054-1e606fe20f31" containerName="extract-content" Mar 09 03:14:00 crc kubenswrapper[4901]: E0309 03:14:00.159740 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192cb251-df07-4653-b054-1e606fe20f31" containerName="extract-utilities" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.159748 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="192cb251-df07-4653-b054-1e606fe20f31" containerName="extract-utilities" Mar 09 03:14:00 crc kubenswrapper[4901]: E0309 03:14:00.159776 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerName="registry-server" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.159784 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerName="registry-server" Mar 09 03:14:00 crc kubenswrapper[4901]: E0309 03:14:00.159801 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192cb251-df07-4653-b054-1e606fe20f31" containerName="registry-server" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.159809 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="192cb251-df07-4653-b054-1e606fe20f31" containerName="registry-server" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.159990 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c4a0f9-7822-43a9-ab69-f42a567fb9cd" containerName="registry-server" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.160022 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="192cb251-df07-4653-b054-1e606fe20f31" containerName="registry-server" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.160768 4901 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550434-x4s5z" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.162517 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.162810 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.163792 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.183842 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550434-x4s5z"] Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.191068 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7k25\" (UniqueName: \"kubernetes.io/projected/52da100b-2af2-43d0-8309-4376e54b34d3-kube-api-access-q7k25\") pod \"auto-csr-approver-29550434-x4s5z\" (UID: \"52da100b-2af2-43d0-8309-4376e54b34d3\") " pod="openshift-infra/auto-csr-approver-29550434-x4s5z" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.292070 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7k25\" (UniqueName: \"kubernetes.io/projected/52da100b-2af2-43d0-8309-4376e54b34d3-kube-api-access-q7k25\") pod \"auto-csr-approver-29550434-x4s5z\" (UID: \"52da100b-2af2-43d0-8309-4376e54b34d3\") " pod="openshift-infra/auto-csr-approver-29550434-x4s5z" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.312888 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7k25\" (UniqueName: \"kubernetes.io/projected/52da100b-2af2-43d0-8309-4376e54b34d3-kube-api-access-q7k25\") pod \"auto-csr-approver-29550434-x4s5z\" (UID: 
\"52da100b-2af2-43d0-8309-4376e54b34d3\") " pod="openshift-infra/auto-csr-approver-29550434-x4s5z" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.483538 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550434-x4s5z" Mar 09 03:14:00 crc kubenswrapper[4901]: I0309 03:14:00.988210 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550434-x4s5z"] Mar 09 03:14:01 crc kubenswrapper[4901]: I0309 03:14:01.102369 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550434-x4s5z" event={"ID":"52da100b-2af2-43d0-8309-4376e54b34d3","Type":"ContainerStarted","Data":"67c76cb97ef179954039b013067d4fe659669d56c9155fcf028514726966e68d"} Mar 09 03:14:03 crc kubenswrapper[4901]: I0309 03:14:03.125472 4901 generic.go:334] "Generic (PLEG): container finished" podID="52da100b-2af2-43d0-8309-4376e54b34d3" containerID="e42bb94cc698e6e14df7f06758942c53fafc31d9b732721ac550ecf9557f7053" exitCode=0 Mar 09 03:14:03 crc kubenswrapper[4901]: I0309 03:14:03.125556 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550434-x4s5z" event={"ID":"52da100b-2af2-43d0-8309-4376e54b34d3","Type":"ContainerDied","Data":"e42bb94cc698e6e14df7f06758942c53fafc31d9b732721ac550ecf9557f7053"} Mar 09 03:14:04 crc kubenswrapper[4901]: I0309 03:14:04.463863 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550434-x4s5z" Mar 09 03:14:04 crc kubenswrapper[4901]: I0309 03:14:04.660993 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7k25\" (UniqueName: \"kubernetes.io/projected/52da100b-2af2-43d0-8309-4376e54b34d3-kube-api-access-q7k25\") pod \"52da100b-2af2-43d0-8309-4376e54b34d3\" (UID: \"52da100b-2af2-43d0-8309-4376e54b34d3\") " Mar 09 03:14:04 crc kubenswrapper[4901]: I0309 03:14:04.668965 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52da100b-2af2-43d0-8309-4376e54b34d3-kube-api-access-q7k25" (OuterVolumeSpecName: "kube-api-access-q7k25") pod "52da100b-2af2-43d0-8309-4376e54b34d3" (UID: "52da100b-2af2-43d0-8309-4376e54b34d3"). InnerVolumeSpecName "kube-api-access-q7k25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:14:04 crc kubenswrapper[4901]: I0309 03:14:04.763361 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7k25\" (UniqueName: \"kubernetes.io/projected/52da100b-2af2-43d0-8309-4376e54b34d3-kube-api-access-q7k25\") on node \"crc\" DevicePath \"\"" Mar 09 03:14:05 crc kubenswrapper[4901]: I0309 03:14:05.149472 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550434-x4s5z" event={"ID":"52da100b-2af2-43d0-8309-4376e54b34d3","Type":"ContainerDied","Data":"67c76cb97ef179954039b013067d4fe659669d56c9155fcf028514726966e68d"} Mar 09 03:14:05 crc kubenswrapper[4901]: I0309 03:14:05.149519 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67c76cb97ef179954039b013067d4fe659669d56c9155fcf028514726966e68d" Mar 09 03:14:05 crc kubenswrapper[4901]: I0309 03:14:05.149581 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550434-x4s5z" Mar 09 03:14:05 crc kubenswrapper[4901]: I0309 03:14:05.561642 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550428-b4t88"] Mar 09 03:14:05 crc kubenswrapper[4901]: I0309 03:14:05.572199 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550428-b4t88"] Mar 09 03:14:06 crc kubenswrapper[4901]: I0309 03:14:06.120490 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00668526-212d-45e2-afa7-dcf90734ff5d" path="/var/lib/kubelet/pods/00668526-212d-45e2-afa7-dcf90734ff5d/volumes" Mar 09 03:14:11 crc kubenswrapper[4901]: I0309 03:14:11.106627 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:14:11 crc kubenswrapper[4901]: E0309 03:14:11.107440 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:14:25 crc kubenswrapper[4901]: I0309 03:14:25.107259 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:14:25 crc kubenswrapper[4901]: E0309 03:14:25.108381 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:14:38 crc kubenswrapper[4901]: I0309 03:14:38.098145 4901 scope.go:117] "RemoveContainer" containerID="35562fcfde12d6f3085945b49680c207afc8b93d36c95a4e0a60217ec22bd8ed" Mar 09 03:14:39 crc kubenswrapper[4901]: I0309 03:14:39.107306 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:14:39 crc kubenswrapper[4901]: E0309 03:14:39.108143 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:14:50 crc kubenswrapper[4901]: I0309 03:14:50.106159 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:14:50 crc kubenswrapper[4901]: E0309 03:14:50.108570 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.180597 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp"] Mar 09 03:15:00 crc kubenswrapper[4901]: E0309 03:15:00.183700 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52da100b-2af2-43d0-8309-4376e54b34d3" containerName="oc" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 
03:15:00.183913 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="52da100b-2af2-43d0-8309-4376e54b34d3" containerName="oc" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.184361 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="52da100b-2af2-43d0-8309-4376e54b34d3" containerName="oc" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.185527 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.186007 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp"] Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.189021 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.208457 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.288553 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcxh7\" (UniqueName: \"kubernetes.io/projected/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-kube-api-access-zcxh7\") pod \"collect-profiles-29550435-4qfpp\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.288665 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-config-volume\") pod \"collect-profiles-29550435-4qfpp\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.290450 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-secret-volume\") pod \"collect-profiles-29550435-4qfpp\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.396283 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-config-volume\") pod \"collect-profiles-29550435-4qfpp\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.394818 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-config-volume\") pod \"collect-profiles-29550435-4qfpp\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.397072 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-secret-volume\") pod \"collect-profiles-29550435-4qfpp\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.397392 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcxh7\" (UniqueName: 
\"kubernetes.io/projected/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-kube-api-access-zcxh7\") pod \"collect-profiles-29550435-4qfpp\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.406286 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-secret-volume\") pod \"collect-profiles-29550435-4qfpp\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.426093 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcxh7\" (UniqueName: \"kubernetes.io/projected/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-kube-api-access-zcxh7\") pod \"collect-profiles-29550435-4qfpp\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:00 crc kubenswrapper[4901]: I0309 03:15:00.531423 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:01 crc kubenswrapper[4901]: I0309 03:15:01.001999 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp"] Mar 09 03:15:01 crc kubenswrapper[4901]: I0309 03:15:01.688385 4901 generic.go:334] "Generic (PLEG): container finished" podID="c5e2f2de-1f76-43b5-97f2-0ad3cf044169" containerID="fb942c6ec6ae344c9d8d8bfa81bfb200c00e8c88d136dbe8cce57bed0c9275ee" exitCode=0 Mar 09 03:15:01 crc kubenswrapper[4901]: I0309 03:15:01.688458 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" event={"ID":"c5e2f2de-1f76-43b5-97f2-0ad3cf044169","Type":"ContainerDied","Data":"fb942c6ec6ae344c9d8d8bfa81bfb200c00e8c88d136dbe8cce57bed0c9275ee"} Mar 09 03:15:01 crc kubenswrapper[4901]: I0309 03:15:01.688523 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" event={"ID":"c5e2f2de-1f76-43b5-97f2-0ad3cf044169","Type":"ContainerStarted","Data":"0381fdd9951c364e983b94d7acdd097bdd71d2a5c1ecfda82ffb2f06ffcf1b2b"} Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.106679 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:15:03 crc kubenswrapper[4901]: E0309 03:15:03.107533 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.127360 4901 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.238245 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-secret-volume\") pod \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.238360 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-config-volume\") pod \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.238458 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcxh7\" (UniqueName: \"kubernetes.io/projected/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-kube-api-access-zcxh7\") pod \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\" (UID: \"c5e2f2de-1f76-43b5-97f2-0ad3cf044169\") " Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.239129 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-config-volume" (OuterVolumeSpecName: "config-volume") pod "c5e2f2de-1f76-43b5-97f2-0ad3cf044169" (UID: "c5e2f2de-1f76-43b5-97f2-0ad3cf044169"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.244895 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-kube-api-access-zcxh7" (OuterVolumeSpecName: "kube-api-access-zcxh7") pod "c5e2f2de-1f76-43b5-97f2-0ad3cf044169" (UID: "c5e2f2de-1f76-43b5-97f2-0ad3cf044169"). 
InnerVolumeSpecName "kube-api-access-zcxh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.245834 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c5e2f2de-1f76-43b5-97f2-0ad3cf044169" (UID: "c5e2f2de-1f76-43b5-97f2-0ad3cf044169"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.340669 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.340727 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcxh7\" (UniqueName: \"kubernetes.io/projected/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-kube-api-access-zcxh7\") on node \"crc\" DevicePath \"\"" Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.340750 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5e2f2de-1f76-43b5-97f2-0ad3cf044169-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.713060 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" event={"ID":"c5e2f2de-1f76-43b5-97f2-0ad3cf044169","Type":"ContainerDied","Data":"0381fdd9951c364e983b94d7acdd097bdd71d2a5c1ecfda82ffb2f06ffcf1b2b"} Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.713116 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0381fdd9951c364e983b94d7acdd097bdd71d2a5c1ecfda82ffb2f06ffcf1b2b" Mar 09 03:15:03 crc kubenswrapper[4901]: I0309 03:15:03.713130 4901 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp" Mar 09 03:15:04 crc kubenswrapper[4901]: I0309 03:15:04.237043 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt"] Mar 09 03:15:04 crc kubenswrapper[4901]: I0309 03:15:04.247749 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550390-hrjpt"] Mar 09 03:15:06 crc kubenswrapper[4901]: I0309 03:15:06.125749 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3336f36-4389-49db-a669-fe0cbc0bfdfd" path="/var/lib/kubelet/pods/e3336f36-4389-49db-a669-fe0cbc0bfdfd/volumes" Mar 09 03:15:18 crc kubenswrapper[4901]: I0309 03:15:18.107134 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:15:18 crc kubenswrapper[4901]: E0309 03:15:18.108078 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:15:30 crc kubenswrapper[4901]: I0309 03:15:30.106058 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:15:30 crc kubenswrapper[4901]: E0309 03:15:30.107057 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:15:38 crc kubenswrapper[4901]: I0309 03:15:38.198547 4901 scope.go:117] "RemoveContainer" containerID="f09bcee651e918ddf27c2b14ec0f74a86cf3b972a8f39631c4b0972b19b67c2d" Mar 09 03:15:41 crc kubenswrapper[4901]: I0309 03:15:41.105831 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:15:41 crc kubenswrapper[4901]: E0309 03:15:41.106627 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:15:55 crc kubenswrapper[4901]: I0309 03:15:55.106499 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:15:55 crc kubenswrapper[4901]: E0309 03:15:55.107879 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.751872 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kkzp8"] Mar 09 03:15:57 crc kubenswrapper[4901]: E0309 03:15:57.753681 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e2f2de-1f76-43b5-97f2-0ad3cf044169" 
containerName="collect-profiles" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.753714 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e2f2de-1f76-43b5-97f2-0ad3cf044169" containerName="collect-profiles" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.754529 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e2f2de-1f76-43b5-97f2-0ad3cf044169" containerName="collect-profiles" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.759080 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.776699 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kkzp8"] Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.779204 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dnsk\" (UniqueName: \"kubernetes.io/projected/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-kube-api-access-2dnsk\") pod \"redhat-operators-kkzp8\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.779446 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-catalog-content\") pod \"redhat-operators-kkzp8\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.779547 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-utilities\") pod \"redhat-operators-kkzp8\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " 
pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.880384 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dnsk\" (UniqueName: \"kubernetes.io/projected/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-kube-api-access-2dnsk\") pod \"redhat-operators-kkzp8\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.880529 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-catalog-content\") pod \"redhat-operators-kkzp8\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.880575 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-utilities\") pod \"redhat-operators-kkzp8\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.881084 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-utilities\") pod \"redhat-operators-kkzp8\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:57 crc kubenswrapper[4901]: I0309 03:15:57.881268 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-catalog-content\") pod \"redhat-operators-kkzp8\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:57 crc 
kubenswrapper[4901]: I0309 03:15:57.902946 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dnsk\" (UniqueName: \"kubernetes.io/projected/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-kube-api-access-2dnsk\") pod \"redhat-operators-kkzp8\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:58 crc kubenswrapper[4901]: I0309 03:15:58.125651 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:15:58 crc kubenswrapper[4901]: I0309 03:15:58.557074 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kkzp8"] Mar 09 03:15:59 crc kubenswrapper[4901]: I0309 03:15:59.248593 4901 generic.go:334] "Generic (PLEG): container finished" podID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerID="564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69" exitCode=0 Mar 09 03:15:59 crc kubenswrapper[4901]: I0309 03:15:59.248931 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkzp8" event={"ID":"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07","Type":"ContainerDied","Data":"564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69"} Mar 09 03:15:59 crc kubenswrapper[4901]: I0309 03:15:59.248957 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkzp8" event={"ID":"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07","Type":"ContainerStarted","Data":"2ce22f999ef1d1012852361aa902a3cd3ea9ad438d5f2e9b39a5d9486aea6f23"} Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.170423 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550436-8h7v6"] Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.171309 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550436-8h7v6" Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.181297 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.181302 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.181370 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.182268 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550436-8h7v6"] Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.218080 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwbmc\" (UniqueName: \"kubernetes.io/projected/2df87d7b-082b-4832-bd7e-b6caed7a3d8a-kube-api-access-rwbmc\") pod \"auto-csr-approver-29550436-8h7v6\" (UID: \"2df87d7b-082b-4832-bd7e-b6caed7a3d8a\") " pod="openshift-infra/auto-csr-approver-29550436-8h7v6" Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.257110 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkzp8" event={"ID":"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07","Type":"ContainerStarted","Data":"8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a"} Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.319053 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwbmc\" (UniqueName: \"kubernetes.io/projected/2df87d7b-082b-4832-bd7e-b6caed7a3d8a-kube-api-access-rwbmc\") pod \"auto-csr-approver-29550436-8h7v6\" (UID: \"2df87d7b-082b-4832-bd7e-b6caed7a3d8a\") " pod="openshift-infra/auto-csr-approver-29550436-8h7v6" Mar 09 03:16:00 crc kubenswrapper[4901]: 
I0309 03:16:00.354920 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwbmc\" (UniqueName: \"kubernetes.io/projected/2df87d7b-082b-4832-bd7e-b6caed7a3d8a-kube-api-access-rwbmc\") pod \"auto-csr-approver-29550436-8h7v6\" (UID: \"2df87d7b-082b-4832-bd7e-b6caed7a3d8a\") " pod="openshift-infra/auto-csr-approver-29550436-8h7v6" Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.485191 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550436-8h7v6" Mar 09 03:16:00 crc kubenswrapper[4901]: I0309 03:16:00.784056 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550436-8h7v6"] Mar 09 03:16:00 crc kubenswrapper[4901]: W0309 03:16:00.800341 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2df87d7b_082b_4832_bd7e_b6caed7a3d8a.slice/crio-dedafe7d0fc7cdf4bedcf379762671006b9d07a3cb056dc88969792ae26124a2 WatchSource:0}: Error finding container dedafe7d0fc7cdf4bedcf379762671006b9d07a3cb056dc88969792ae26124a2: Status 404 returned error can't find the container with id dedafe7d0fc7cdf4bedcf379762671006b9d07a3cb056dc88969792ae26124a2 Mar 09 03:16:01 crc kubenswrapper[4901]: I0309 03:16:01.269883 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550436-8h7v6" event={"ID":"2df87d7b-082b-4832-bd7e-b6caed7a3d8a","Type":"ContainerStarted","Data":"dedafe7d0fc7cdf4bedcf379762671006b9d07a3cb056dc88969792ae26124a2"} Mar 09 03:16:01 crc kubenswrapper[4901]: I0309 03:16:01.273646 4901 generic.go:334] "Generic (PLEG): container finished" podID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerID="8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a" exitCode=0 Mar 09 03:16:01 crc kubenswrapper[4901]: I0309 03:16:01.273716 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-kkzp8" event={"ID":"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07","Type":"ContainerDied","Data":"8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a"} Mar 09 03:16:02 crc kubenswrapper[4901]: I0309 03:16:02.283936 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550436-8h7v6" event={"ID":"2df87d7b-082b-4832-bd7e-b6caed7a3d8a","Type":"ContainerStarted","Data":"8a6ae6112d498700ffac947a38080cfffe8f59d1e9325f822479e005ada8206b"} Mar 09 03:16:02 crc kubenswrapper[4901]: I0309 03:16:02.288622 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkzp8" event={"ID":"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07","Type":"ContainerStarted","Data":"2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd"} Mar 09 03:16:02 crc kubenswrapper[4901]: I0309 03:16:02.314797 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550436-8h7v6" podStartSLOduration=1.410858306 podStartE2EDuration="2.314777419s" podCreationTimestamp="2026-03-09 03:16:00 +0000 UTC" firstStartedPulling="2026-03-09 03:16:00.803100206 +0000 UTC m=+2085.392763938" lastFinishedPulling="2026-03-09 03:16:01.707019279 +0000 UTC m=+2086.296683051" observedRunningTime="2026-03-09 03:16:02.296607497 +0000 UTC m=+2086.886271269" watchObservedRunningTime="2026-03-09 03:16:02.314777419 +0000 UTC m=+2086.904441171" Mar 09 03:16:02 crc kubenswrapper[4901]: I0309 03:16:02.316674 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kkzp8" podStartSLOduration=2.645039306 podStartE2EDuration="5.316663976s" podCreationTimestamp="2026-03-09 03:15:57 +0000 UTC" firstStartedPulling="2026-03-09 03:15:59.250759383 +0000 UTC m=+2083.840423135" lastFinishedPulling="2026-03-09 03:16:01.922384063 +0000 UTC m=+2086.512047805" observedRunningTime="2026-03-09 03:16:02.312262806 
+0000 UTC m=+2086.901926548" watchObservedRunningTime="2026-03-09 03:16:02.316663976 +0000 UTC m=+2086.906327728" Mar 09 03:16:03 crc kubenswrapper[4901]: I0309 03:16:03.299837 4901 generic.go:334] "Generic (PLEG): container finished" podID="2df87d7b-082b-4832-bd7e-b6caed7a3d8a" containerID="8a6ae6112d498700ffac947a38080cfffe8f59d1e9325f822479e005ada8206b" exitCode=0 Mar 09 03:16:03 crc kubenswrapper[4901]: I0309 03:16:03.299915 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550436-8h7v6" event={"ID":"2df87d7b-082b-4832-bd7e-b6caed7a3d8a","Type":"ContainerDied","Data":"8a6ae6112d498700ffac947a38080cfffe8f59d1e9325f822479e005ada8206b"} Mar 09 03:16:04 crc kubenswrapper[4901]: I0309 03:16:04.642555 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550436-8h7v6" Mar 09 03:16:04 crc kubenswrapper[4901]: I0309 03:16:04.816815 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwbmc\" (UniqueName: \"kubernetes.io/projected/2df87d7b-082b-4832-bd7e-b6caed7a3d8a-kube-api-access-rwbmc\") pod \"2df87d7b-082b-4832-bd7e-b6caed7a3d8a\" (UID: \"2df87d7b-082b-4832-bd7e-b6caed7a3d8a\") " Mar 09 03:16:04 crc kubenswrapper[4901]: I0309 03:16:04.832553 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df87d7b-082b-4832-bd7e-b6caed7a3d8a-kube-api-access-rwbmc" (OuterVolumeSpecName: "kube-api-access-rwbmc") pod "2df87d7b-082b-4832-bd7e-b6caed7a3d8a" (UID: "2df87d7b-082b-4832-bd7e-b6caed7a3d8a"). InnerVolumeSpecName "kube-api-access-rwbmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:16:04 crc kubenswrapper[4901]: I0309 03:16:04.918983 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwbmc\" (UniqueName: \"kubernetes.io/projected/2df87d7b-082b-4832-bd7e-b6caed7a3d8a-kube-api-access-rwbmc\") on node \"crc\" DevicePath \"\"" Mar 09 03:16:05 crc kubenswrapper[4901]: I0309 03:16:05.323044 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550436-8h7v6" event={"ID":"2df87d7b-082b-4832-bd7e-b6caed7a3d8a","Type":"ContainerDied","Data":"dedafe7d0fc7cdf4bedcf379762671006b9d07a3cb056dc88969792ae26124a2"} Mar 09 03:16:05 crc kubenswrapper[4901]: I0309 03:16:05.323096 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dedafe7d0fc7cdf4bedcf379762671006b9d07a3cb056dc88969792ae26124a2" Mar 09 03:16:05 crc kubenswrapper[4901]: I0309 03:16:05.323147 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550436-8h7v6" Mar 09 03:16:05 crc kubenswrapper[4901]: I0309 03:16:05.386988 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550430-t9p6m"] Mar 09 03:16:05 crc kubenswrapper[4901]: I0309 03:16:05.393256 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550430-t9p6m"] Mar 09 03:16:06 crc kubenswrapper[4901]: I0309 03:16:06.114557 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5251665d-9f46-4d5a-8c20-0962766e0f84" path="/var/lib/kubelet/pods/5251665d-9f46-4d5a-8c20-0962766e0f84/volumes" Mar 09 03:16:08 crc kubenswrapper[4901]: I0309 03:16:08.112113 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:16:08 crc kubenswrapper[4901]: I0309 03:16:08.125852 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:16:08 crc kubenswrapper[4901]: I0309 03:16:08.125901 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:16:08 crc kubenswrapper[4901]: I0309 03:16:08.360482 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"fe198574e3e8e3b772eb309166e1823040e85757d5648d18c8bedb1946e05ad9"} Mar 09 03:16:09 crc kubenswrapper[4901]: I0309 03:16:09.173789 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kkzp8" podUID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerName="registry-server" probeResult="failure" output=< Mar 09 03:16:09 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 03:16:09 crc kubenswrapper[4901]: > Mar 09 03:16:18 crc kubenswrapper[4901]: I0309 03:16:18.200086 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:16:18 crc kubenswrapper[4901]: I0309 03:16:18.269760 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:16:18 crc kubenswrapper[4901]: I0309 03:16:18.439763 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kkzp8"] Mar 09 03:16:19 crc kubenswrapper[4901]: I0309 03:16:19.553804 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kkzp8" podUID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerName="registry-server" containerID="cri-o://2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd" gracePeriod=2 Mar 09 03:16:19 crc kubenswrapper[4901]: I0309 03:16:19.970484 4901 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.107556 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-utilities\") pod \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.107770 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dnsk\" (UniqueName: \"kubernetes.io/projected/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-kube-api-access-2dnsk\") pod \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.107938 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-catalog-content\") pod \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\" (UID: \"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07\") " Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.108524 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-utilities" (OuterVolumeSpecName: "utilities") pod "1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" (UID: "1c7f66ad-fcc3-40fe-8b14-d16c3618bf07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.117439 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-kube-api-access-2dnsk" (OuterVolumeSpecName: "kube-api-access-2dnsk") pod "1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" (UID: "1c7f66ad-fcc3-40fe-8b14-d16c3618bf07"). 
InnerVolumeSpecName "kube-api-access-2dnsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.121345 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dnsk\" (UniqueName: \"kubernetes.io/projected/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-kube-api-access-2dnsk\") on node \"crc\" DevicePath \"\"" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.121408 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.331182 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" (UID: "1c7f66ad-fcc3-40fe-8b14-d16c3618bf07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.425930 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.563602 4901 generic.go:334] "Generic (PLEG): container finished" podID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerID="2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd" exitCode=0 Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.563658 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkzp8" event={"ID":"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07","Type":"ContainerDied","Data":"2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd"} Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.563675 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kkzp8" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.563695 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkzp8" event={"ID":"1c7f66ad-fcc3-40fe-8b14-d16c3618bf07","Type":"ContainerDied","Data":"2ce22f999ef1d1012852361aa902a3cd3ea9ad438d5f2e9b39a5d9486aea6f23"} Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.563719 4901 scope.go:117] "RemoveContainer" containerID="2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.589093 4901 scope.go:117] "RemoveContainer" containerID="8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.626345 4901 scope.go:117] "RemoveContainer" containerID="564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.626461 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kkzp8"] Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.643760 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kkzp8"] Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.653049 4901 scope.go:117] "RemoveContainer" containerID="2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd" Mar 09 03:16:20 crc kubenswrapper[4901]: E0309 03:16:20.653797 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd\": container with ID starting with 2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd not found: ID does not exist" containerID="2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.653831 4901 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd"} err="failed to get container status \"2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd\": rpc error: code = NotFound desc = could not find container \"2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd\": container with ID starting with 2a3281a779adbeea24eb3c9d8eb2be2b7de5bdc1ad036ca5e680e68d04e8b4cd not found: ID does not exist" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.653850 4901 scope.go:117] "RemoveContainer" containerID="8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a" Mar 09 03:16:20 crc kubenswrapper[4901]: E0309 03:16:20.654240 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a\": container with ID starting with 8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a not found: ID does not exist" containerID="8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.654267 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a"} err="failed to get container status \"8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a\": rpc error: code = NotFound desc = could not find container \"8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a\": container with ID starting with 8aab9d223883d8af8323c3966a7d33f15d15803bfea7dab1d2909d1b289b633a not found: ID does not exist" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.654280 4901 scope.go:117] "RemoveContainer" containerID="564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69" Mar 09 03:16:20 crc kubenswrapper[4901]: E0309 
03:16:20.654608 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69\": container with ID starting with 564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69 not found: ID does not exist" containerID="564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69" Mar 09 03:16:20 crc kubenswrapper[4901]: I0309 03:16:20.654628 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69"} err="failed to get container status \"564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69\": rpc error: code = NotFound desc = could not find container \"564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69\": container with ID starting with 564f6b45cc55ca2eea3ab9a9306cfaf56a722aa00913fb4d91b7a8153a68ef69 not found: ID does not exist" Mar 09 03:16:22 crc kubenswrapper[4901]: I0309 03:16:22.123848 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" path="/var/lib/kubelet/pods/1c7f66ad-fcc3-40fe-8b14-d16c3618bf07/volumes" Mar 09 03:16:38 crc kubenswrapper[4901]: I0309 03:16:38.261018 4901 scope.go:117] "RemoveContainer" containerID="09331b7dd3de1dc5e49fc358cbe46ab76e0d17711be3c5ac553b513458f833ab" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.156866 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550438-q2n4q"] Mar 09 03:18:00 crc kubenswrapper[4901]: E0309 03:18:00.157669 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerName="extract-utilities" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.157704 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" 
containerName="extract-utilities" Mar 09 03:18:00 crc kubenswrapper[4901]: E0309 03:18:00.157721 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df87d7b-082b-4832-bd7e-b6caed7a3d8a" containerName="oc" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.157729 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df87d7b-082b-4832-bd7e-b6caed7a3d8a" containerName="oc" Mar 09 03:18:00 crc kubenswrapper[4901]: E0309 03:18:00.157739 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerName="extract-content" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.157747 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerName="extract-content" Mar 09 03:18:00 crc kubenswrapper[4901]: E0309 03:18:00.157756 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerName="registry-server" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.157764 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerName="registry-server" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.157924 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df87d7b-082b-4832-bd7e-b6caed7a3d8a" containerName="oc" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.157936 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7f66ad-fcc3-40fe-8b14-d16c3618bf07" containerName="registry-server" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.158972 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550438-q2n4q" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.163047 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.163049 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.163968 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.179240 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550438-q2n4q"] Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.334074 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jq8g\" (UniqueName: \"kubernetes.io/projected/c668c14b-279e-4201-9464-37f15f96b512-kube-api-access-9jq8g\") pod \"auto-csr-approver-29550438-q2n4q\" (UID: \"c668c14b-279e-4201-9464-37f15f96b512\") " pod="openshift-infra/auto-csr-approver-29550438-q2n4q" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.435089 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jq8g\" (UniqueName: \"kubernetes.io/projected/c668c14b-279e-4201-9464-37f15f96b512-kube-api-access-9jq8g\") pod \"auto-csr-approver-29550438-q2n4q\" (UID: \"c668c14b-279e-4201-9464-37f15f96b512\") " pod="openshift-infra/auto-csr-approver-29550438-q2n4q" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.457176 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jq8g\" (UniqueName: \"kubernetes.io/projected/c668c14b-279e-4201-9464-37f15f96b512-kube-api-access-9jq8g\") pod \"auto-csr-approver-29550438-q2n4q\" (UID: \"c668c14b-279e-4201-9464-37f15f96b512\") " 
pod="openshift-infra/auto-csr-approver-29550438-q2n4q" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.479932 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550438-q2n4q" Mar 09 03:18:00 crc kubenswrapper[4901]: I0309 03:18:00.996292 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550438-q2n4q"] Mar 09 03:18:01 crc kubenswrapper[4901]: W0309 03:18:01.005331 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc668c14b_279e_4201_9464_37f15f96b512.slice/crio-722f9ed005677cc3d3d9ef05552f8f0ef8b092a76230f22ab9f26ddbea30df1c WatchSource:0}: Error finding container 722f9ed005677cc3d3d9ef05552f8f0ef8b092a76230f22ab9f26ddbea30df1c: Status 404 returned error can't find the container with id 722f9ed005677cc3d3d9ef05552f8f0ef8b092a76230f22ab9f26ddbea30df1c Mar 09 03:18:01 crc kubenswrapper[4901]: I0309 03:18:01.008553 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 03:18:01 crc kubenswrapper[4901]: I0309 03:18:01.566893 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550438-q2n4q" event={"ID":"c668c14b-279e-4201-9464-37f15f96b512","Type":"ContainerStarted","Data":"722f9ed005677cc3d3d9ef05552f8f0ef8b092a76230f22ab9f26ddbea30df1c"} Mar 09 03:18:02 crc kubenswrapper[4901]: I0309 03:18:02.597040 4901 generic.go:334] "Generic (PLEG): container finished" podID="c668c14b-279e-4201-9464-37f15f96b512" containerID="4fc8b3ea38c5140c7d3555a226e0ec78dd84ee145d0d16a912adc8e6220510ed" exitCode=0 Mar 09 03:18:02 crc kubenswrapper[4901]: I0309 03:18:02.597191 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550438-q2n4q" 
event={"ID":"c668c14b-279e-4201-9464-37f15f96b512","Type":"ContainerDied","Data":"4fc8b3ea38c5140c7d3555a226e0ec78dd84ee145d0d16a912adc8e6220510ed"} Mar 09 03:18:04 crc kubenswrapper[4901]: I0309 03:18:04.058164 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550438-q2n4q" Mar 09 03:18:04 crc kubenswrapper[4901]: I0309 03:18:04.196949 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jq8g\" (UniqueName: \"kubernetes.io/projected/c668c14b-279e-4201-9464-37f15f96b512-kube-api-access-9jq8g\") pod \"c668c14b-279e-4201-9464-37f15f96b512\" (UID: \"c668c14b-279e-4201-9464-37f15f96b512\") " Mar 09 03:18:04 crc kubenswrapper[4901]: I0309 03:18:04.208993 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c668c14b-279e-4201-9464-37f15f96b512-kube-api-access-9jq8g" (OuterVolumeSpecName: "kube-api-access-9jq8g") pod "c668c14b-279e-4201-9464-37f15f96b512" (UID: "c668c14b-279e-4201-9464-37f15f96b512"). InnerVolumeSpecName "kube-api-access-9jq8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:18:04 crc kubenswrapper[4901]: I0309 03:18:04.299967 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jq8g\" (UniqueName: \"kubernetes.io/projected/c668c14b-279e-4201-9464-37f15f96b512-kube-api-access-9jq8g\") on node \"crc\" DevicePath \"\"" Mar 09 03:18:04 crc kubenswrapper[4901]: I0309 03:18:04.616516 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550438-q2n4q" event={"ID":"c668c14b-279e-4201-9464-37f15f96b512","Type":"ContainerDied","Data":"722f9ed005677cc3d3d9ef05552f8f0ef8b092a76230f22ab9f26ddbea30df1c"} Mar 09 03:18:04 crc kubenswrapper[4901]: I0309 03:18:04.616561 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="722f9ed005677cc3d3d9ef05552f8f0ef8b092a76230f22ab9f26ddbea30df1c" Mar 09 03:18:04 crc kubenswrapper[4901]: I0309 03:18:04.616590 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550438-q2n4q" Mar 09 03:18:05 crc kubenswrapper[4901]: I0309 03:18:05.163031 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550432-rb2kd"] Mar 09 03:18:05 crc kubenswrapper[4901]: I0309 03:18:05.177351 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550432-rb2kd"] Mar 09 03:18:06 crc kubenswrapper[4901]: I0309 03:18:06.114779 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594dbe7d-afcd-4710-be6a-631e834c2614" path="/var/lib/kubelet/pods/594dbe7d-afcd-4710-be6a-631e834c2614/volumes" Mar 09 03:18:30 crc kubenswrapper[4901]: I0309 03:18:30.863075 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 09 03:18:30 crc kubenswrapper[4901]: I0309 03:18:30.863657 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:18:38 crc kubenswrapper[4901]: I0309 03:18:38.385712 4901 scope.go:117] "RemoveContainer" containerID="97870cd09e2cc5eb34a06e20679ab245e08203e1d8392052533ec378e5319d9a" Mar 09 03:19:00 crc kubenswrapper[4901]: I0309 03:19:00.862699 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:19:00 crc kubenswrapper[4901]: I0309 03:19:00.863587 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:19:30 crc kubenswrapper[4901]: I0309 03:19:30.863060 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:19:30 crc kubenswrapper[4901]: I0309 03:19:30.863784 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:19:30 crc kubenswrapper[4901]: I0309 03:19:30.863853 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 03:19:30 crc kubenswrapper[4901]: I0309 03:19:30.864891 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe198574e3e8e3b772eb309166e1823040e85757d5648d18c8bedb1946e05ad9"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 03:19:30 crc kubenswrapper[4901]: I0309 03:19:30.865002 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://fe198574e3e8e3b772eb309166e1823040e85757d5648d18c8bedb1946e05ad9" gracePeriod=600 Mar 09 03:19:31 crc kubenswrapper[4901]: I0309 03:19:31.449593 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="fe198574e3e8e3b772eb309166e1823040e85757d5648d18c8bedb1946e05ad9" exitCode=0 Mar 09 03:19:31 crc kubenswrapper[4901]: I0309 03:19:31.449679 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"fe198574e3e8e3b772eb309166e1823040e85757d5648d18c8bedb1946e05ad9"} Mar 09 03:19:31 crc kubenswrapper[4901]: I0309 03:19:31.450116 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e"} Mar 09 03:19:31 crc kubenswrapper[4901]: I0309 03:19:31.450139 4901 scope.go:117] "RemoveContainer" containerID="324d191a9a6cf1d15e57a90535c15f4180b030663e41d38d4e613a6168e3a1a2" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.171404 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550440-jzzt2"] Mar 09 03:20:00 crc kubenswrapper[4901]: E0309 03:20:00.172898 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c668c14b-279e-4201-9464-37f15f96b512" containerName="oc" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.172922 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c668c14b-279e-4201-9464-37f15f96b512" containerName="oc" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.173403 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c668c14b-279e-4201-9464-37f15f96b512" containerName="oc" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.174136 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550440-jzzt2" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.176273 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.176828 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.178235 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.186630 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550440-jzzt2"] Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.320536 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxcpv\" (UniqueName: \"kubernetes.io/projected/0c6aa863-04d9-44c1-95ba-22f343e6e914-kube-api-access-jxcpv\") pod \"auto-csr-approver-29550440-jzzt2\" (UID: \"0c6aa863-04d9-44c1-95ba-22f343e6e914\") " pod="openshift-infra/auto-csr-approver-29550440-jzzt2" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.422176 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxcpv\" (UniqueName: \"kubernetes.io/projected/0c6aa863-04d9-44c1-95ba-22f343e6e914-kube-api-access-jxcpv\") pod \"auto-csr-approver-29550440-jzzt2\" (UID: \"0c6aa863-04d9-44c1-95ba-22f343e6e914\") " pod="openshift-infra/auto-csr-approver-29550440-jzzt2" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.451391 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxcpv\" (UniqueName: \"kubernetes.io/projected/0c6aa863-04d9-44c1-95ba-22f343e6e914-kube-api-access-jxcpv\") pod \"auto-csr-approver-29550440-jzzt2\" (UID: \"0c6aa863-04d9-44c1-95ba-22f343e6e914\") " 
pod="openshift-infra/auto-csr-approver-29550440-jzzt2" Mar 09 03:20:00 crc kubenswrapper[4901]: I0309 03:20:00.502751 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550440-jzzt2" Mar 09 03:20:01 crc kubenswrapper[4901]: I0309 03:20:01.022112 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550440-jzzt2"] Mar 09 03:20:01 crc kubenswrapper[4901]: I0309 03:20:01.766988 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550440-jzzt2" event={"ID":"0c6aa863-04d9-44c1-95ba-22f343e6e914","Type":"ContainerStarted","Data":"bca1d64a5c6bc145c9363e299a7c2fbabeb018e0685815ff01d569fcb6a773ba"} Mar 09 03:20:03 crc kubenswrapper[4901]: I0309 03:20:03.789054 4901 generic.go:334] "Generic (PLEG): container finished" podID="0c6aa863-04d9-44c1-95ba-22f343e6e914" containerID="1ef45a2d2de790508e49589be6162b1bd92b465543f0df92fbf8b828ab064158" exitCode=0 Mar 09 03:20:03 crc kubenswrapper[4901]: I0309 03:20:03.789125 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550440-jzzt2" event={"ID":"0c6aa863-04d9-44c1-95ba-22f343e6e914","Type":"ContainerDied","Data":"1ef45a2d2de790508e49589be6162b1bd92b465543f0df92fbf8b828ab064158"} Mar 09 03:20:05 crc kubenswrapper[4901]: I0309 03:20:05.135572 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550440-jzzt2" Mar 09 03:20:05 crc kubenswrapper[4901]: I0309 03:20:05.302771 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxcpv\" (UniqueName: \"kubernetes.io/projected/0c6aa863-04d9-44c1-95ba-22f343e6e914-kube-api-access-jxcpv\") pod \"0c6aa863-04d9-44c1-95ba-22f343e6e914\" (UID: \"0c6aa863-04d9-44c1-95ba-22f343e6e914\") " Mar 09 03:20:05 crc kubenswrapper[4901]: I0309 03:20:05.310575 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6aa863-04d9-44c1-95ba-22f343e6e914-kube-api-access-jxcpv" (OuterVolumeSpecName: "kube-api-access-jxcpv") pod "0c6aa863-04d9-44c1-95ba-22f343e6e914" (UID: "0c6aa863-04d9-44c1-95ba-22f343e6e914"). InnerVolumeSpecName "kube-api-access-jxcpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:20:05 crc kubenswrapper[4901]: I0309 03:20:05.404242 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxcpv\" (UniqueName: \"kubernetes.io/projected/0c6aa863-04d9-44c1-95ba-22f343e6e914-kube-api-access-jxcpv\") on node \"crc\" DevicePath \"\"" Mar 09 03:20:05 crc kubenswrapper[4901]: I0309 03:20:05.809316 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550440-jzzt2" event={"ID":"0c6aa863-04d9-44c1-95ba-22f343e6e914","Type":"ContainerDied","Data":"bca1d64a5c6bc145c9363e299a7c2fbabeb018e0685815ff01d569fcb6a773ba"} Mar 09 03:20:05 crc kubenswrapper[4901]: I0309 03:20:05.809377 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bca1d64a5c6bc145c9363e299a7c2fbabeb018e0685815ff01d569fcb6a773ba" Mar 09 03:20:05 crc kubenswrapper[4901]: I0309 03:20:05.809468 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550440-jzzt2" Mar 09 03:20:06 crc kubenswrapper[4901]: I0309 03:20:06.237338 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550434-x4s5z"] Mar 09 03:20:06 crc kubenswrapper[4901]: I0309 03:20:06.248479 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550434-x4s5z"] Mar 09 03:20:08 crc kubenswrapper[4901]: I0309 03:20:08.121694 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52da100b-2af2-43d0-8309-4376e54b34d3" path="/var/lib/kubelet/pods/52da100b-2af2-43d0-8309-4376e54b34d3/volumes" Mar 09 03:20:38 crc kubenswrapper[4901]: I0309 03:20:38.476688 4901 scope.go:117] "RemoveContainer" containerID="e42bb94cc698e6e14df7f06758942c53fafc31d9b732721ac550ecf9557f7053" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.159919 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550442-79j67"] Mar 09 03:22:00 crc kubenswrapper[4901]: E0309 03:22:00.161101 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6aa863-04d9-44c1-95ba-22f343e6e914" containerName="oc" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.161126 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6aa863-04d9-44c1-95ba-22f343e6e914" containerName="oc" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.161442 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6aa863-04d9-44c1-95ba-22f343e6e914" containerName="oc" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.162358 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550442-79j67" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.165971 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.165981 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.165992 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.172490 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550442-79j67"] Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.210187 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldqr\" (UniqueName: \"kubernetes.io/projected/ff6db7d3-b776-43ea-911a-411904a0deb4-kube-api-access-6ldqr\") pod \"auto-csr-approver-29550442-79j67\" (UID: \"ff6db7d3-b776-43ea-911a-411904a0deb4\") " pod="openshift-infra/auto-csr-approver-29550442-79j67" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.312928 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldqr\" (UniqueName: \"kubernetes.io/projected/ff6db7d3-b776-43ea-911a-411904a0deb4-kube-api-access-6ldqr\") pod \"auto-csr-approver-29550442-79j67\" (UID: \"ff6db7d3-b776-43ea-911a-411904a0deb4\") " pod="openshift-infra/auto-csr-approver-29550442-79j67" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.335845 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldqr\" (UniqueName: \"kubernetes.io/projected/ff6db7d3-b776-43ea-911a-411904a0deb4-kube-api-access-6ldqr\") pod \"auto-csr-approver-29550442-79j67\" (UID: \"ff6db7d3-b776-43ea-911a-411904a0deb4\") " 
pod="openshift-infra/auto-csr-approver-29550442-79j67" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.499144 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550442-79j67" Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.780483 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550442-79j67"] Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.859608 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550442-79j67" event={"ID":"ff6db7d3-b776-43ea-911a-411904a0deb4","Type":"ContainerStarted","Data":"4fe6b9fe9c5aa574568562ffd746b80a59fda3697e21ea6bae29e8a174363ecb"} Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.863733 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:22:00 crc kubenswrapper[4901]: I0309 03:22:00.863812 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:22:02 crc kubenswrapper[4901]: I0309 03:22:02.884064 4901 generic.go:334] "Generic (PLEG): container finished" podID="ff6db7d3-b776-43ea-911a-411904a0deb4" containerID="fdce3477370416b31f6ddef6a9fd3137859cf002a6ac5f19d90b45842088dfdc" exitCode=0 Mar 09 03:22:02 crc kubenswrapper[4901]: I0309 03:22:02.884169 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550442-79j67" 
event={"ID":"ff6db7d3-b776-43ea-911a-411904a0deb4","Type":"ContainerDied","Data":"fdce3477370416b31f6ddef6a9fd3137859cf002a6ac5f19d90b45842088dfdc"} Mar 09 03:22:04 crc kubenswrapper[4901]: I0309 03:22:04.239661 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550442-79j67" Mar 09 03:22:04 crc kubenswrapper[4901]: I0309 03:22:04.280724 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ldqr\" (UniqueName: \"kubernetes.io/projected/ff6db7d3-b776-43ea-911a-411904a0deb4-kube-api-access-6ldqr\") pod \"ff6db7d3-b776-43ea-911a-411904a0deb4\" (UID: \"ff6db7d3-b776-43ea-911a-411904a0deb4\") " Mar 09 03:22:04 crc kubenswrapper[4901]: I0309 03:22:04.286122 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6db7d3-b776-43ea-911a-411904a0deb4-kube-api-access-6ldqr" (OuterVolumeSpecName: "kube-api-access-6ldqr") pod "ff6db7d3-b776-43ea-911a-411904a0deb4" (UID: "ff6db7d3-b776-43ea-911a-411904a0deb4"). InnerVolumeSpecName "kube-api-access-6ldqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:22:04 crc kubenswrapper[4901]: I0309 03:22:04.382204 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ldqr\" (UniqueName: \"kubernetes.io/projected/ff6db7d3-b776-43ea-911a-411904a0deb4-kube-api-access-6ldqr\") on node \"crc\" DevicePath \"\"" Mar 09 03:22:04 crc kubenswrapper[4901]: I0309 03:22:04.906318 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550442-79j67" event={"ID":"ff6db7d3-b776-43ea-911a-411904a0deb4","Type":"ContainerDied","Data":"4fe6b9fe9c5aa574568562ffd746b80a59fda3697e21ea6bae29e8a174363ecb"} Mar 09 03:22:04 crc kubenswrapper[4901]: I0309 03:22:04.906440 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fe6b9fe9c5aa574568562ffd746b80a59fda3697e21ea6bae29e8a174363ecb" Mar 09 03:22:04 crc kubenswrapper[4901]: I0309 03:22:04.906513 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550442-79j67" Mar 09 03:22:05 crc kubenswrapper[4901]: I0309 03:22:05.326758 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550436-8h7v6"] Mar 09 03:22:05 crc kubenswrapper[4901]: I0309 03:22:05.337447 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550436-8h7v6"] Mar 09 03:22:06 crc kubenswrapper[4901]: I0309 03:22:06.122613 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df87d7b-082b-4832-bd7e-b6caed7a3d8a" path="/var/lib/kubelet/pods/2df87d7b-082b-4832-bd7e-b6caed7a3d8a/volumes" Mar 09 03:22:30 crc kubenswrapper[4901]: I0309 03:22:30.863442 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 09 03:22:30 crc kubenswrapper[4901]: I0309 03:22:30.864111 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.472714 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tqkhd"] Mar 09 03:22:31 crc kubenswrapper[4901]: E0309 03:22:31.473025 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6db7d3-b776-43ea-911a-411904a0deb4" containerName="oc" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.473038 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6db7d3-b776-43ea-911a-411904a0deb4" containerName="oc" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.473170 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6db7d3-b776-43ea-911a-411904a0deb4" containerName="oc" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.474154 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.516110 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tqkhd"] Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.660821 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-utilities\") pod \"certified-operators-tqkhd\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.661132 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b44w\" (UniqueName: \"kubernetes.io/projected/718d214a-065b-4b5a-8d75-db093428a9bf-kube-api-access-4b44w\") pod \"certified-operators-tqkhd\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.661178 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-catalog-content\") pod \"certified-operators-tqkhd\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.762170 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-utilities\") pod \"certified-operators-tqkhd\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.762315 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4b44w\" (UniqueName: \"kubernetes.io/projected/718d214a-065b-4b5a-8d75-db093428a9bf-kube-api-access-4b44w\") pod \"certified-operators-tqkhd\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.762439 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-catalog-content\") pod \"certified-operators-tqkhd\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.762731 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-utilities\") pod \"certified-operators-tqkhd\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.762969 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-catalog-content\") pod \"certified-operators-tqkhd\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.788155 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b44w\" (UniqueName: \"kubernetes.io/projected/718d214a-065b-4b5a-8d75-db093428a9bf-kube-api-access-4b44w\") pod \"certified-operators-tqkhd\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:31 crc kubenswrapper[4901]: I0309 03:22:31.801980 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:32 crc kubenswrapper[4901]: I0309 03:22:32.041979 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tqkhd"] Mar 09 03:22:32 crc kubenswrapper[4901]: I0309 03:22:32.177643 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqkhd" event={"ID":"718d214a-065b-4b5a-8d75-db093428a9bf","Type":"ContainerStarted","Data":"fc724dd4971210943a7411b5b5df2df250e3d44d066ae12a63d831817f85ce3d"} Mar 09 03:22:33 crc kubenswrapper[4901]: I0309 03:22:33.191268 4901 generic.go:334] "Generic (PLEG): container finished" podID="718d214a-065b-4b5a-8d75-db093428a9bf" containerID="70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca" exitCode=0 Mar 09 03:22:33 crc kubenswrapper[4901]: I0309 03:22:33.191343 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqkhd" event={"ID":"718d214a-065b-4b5a-8d75-db093428a9bf","Type":"ContainerDied","Data":"70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca"} Mar 09 03:22:35 crc kubenswrapper[4901]: I0309 03:22:35.215731 4901 generic.go:334] "Generic (PLEG): container finished" podID="718d214a-065b-4b5a-8d75-db093428a9bf" containerID="06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24" exitCode=0 Mar 09 03:22:35 crc kubenswrapper[4901]: I0309 03:22:35.217213 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqkhd" event={"ID":"718d214a-065b-4b5a-8d75-db093428a9bf","Type":"ContainerDied","Data":"06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24"} Mar 09 03:22:36 crc kubenswrapper[4901]: I0309 03:22:36.233582 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqkhd" 
event={"ID":"718d214a-065b-4b5a-8d75-db093428a9bf","Type":"ContainerStarted","Data":"82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976"} Mar 09 03:22:36 crc kubenswrapper[4901]: I0309 03:22:36.267748 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tqkhd" podStartSLOduration=2.848720893 podStartE2EDuration="5.267719373s" podCreationTimestamp="2026-03-09 03:22:31 +0000 UTC" firstStartedPulling="2026-03-09 03:22:33.193520732 +0000 UTC m=+2477.783184494" lastFinishedPulling="2026-03-09 03:22:35.612519202 +0000 UTC m=+2480.202182974" observedRunningTime="2026-03-09 03:22:36.261620349 +0000 UTC m=+2480.851284121" watchObservedRunningTime="2026-03-09 03:22:36.267719373 +0000 UTC m=+2480.857383145" Mar 09 03:22:38 crc kubenswrapper[4901]: I0309 03:22:38.562283 4901 scope.go:117] "RemoveContainer" containerID="8a6ae6112d498700ffac947a38080cfffe8f59d1e9325f822479e005ada8206b" Mar 09 03:22:41 crc kubenswrapper[4901]: I0309 03:22:41.803188 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:41 crc kubenswrapper[4901]: I0309 03:22:41.804301 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:41 crc kubenswrapper[4901]: I0309 03:22:41.882301 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:42 crc kubenswrapper[4901]: I0309 03:22:42.368106 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:42 crc kubenswrapper[4901]: I0309 03:22:42.436173 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tqkhd"] Mar 09 03:22:44 crc kubenswrapper[4901]: I0309 03:22:44.330287 4901 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/certified-operators-tqkhd" podUID="718d214a-065b-4b5a-8d75-db093428a9bf" containerName="registry-server" containerID="cri-o://82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976" gracePeriod=2 Mar 09 03:22:44 crc kubenswrapper[4901]: I0309 03:22:44.851295 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:44 crc kubenswrapper[4901]: I0309 03:22:44.921057 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-utilities\") pod \"718d214a-065b-4b5a-8d75-db093428a9bf\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " Mar 09 03:22:44 crc kubenswrapper[4901]: I0309 03:22:44.921128 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b44w\" (UniqueName: \"kubernetes.io/projected/718d214a-065b-4b5a-8d75-db093428a9bf-kube-api-access-4b44w\") pod \"718d214a-065b-4b5a-8d75-db093428a9bf\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " Mar 09 03:22:44 crc kubenswrapper[4901]: I0309 03:22:44.921192 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-catalog-content\") pod \"718d214a-065b-4b5a-8d75-db093428a9bf\" (UID: \"718d214a-065b-4b5a-8d75-db093428a9bf\") " Mar 09 03:22:44 crc kubenswrapper[4901]: I0309 03:22:44.922790 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-utilities" (OuterVolumeSpecName: "utilities") pod "718d214a-065b-4b5a-8d75-db093428a9bf" (UID: "718d214a-065b-4b5a-8d75-db093428a9bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:22:44 crc kubenswrapper[4901]: I0309 03:22:44.927466 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718d214a-065b-4b5a-8d75-db093428a9bf-kube-api-access-4b44w" (OuterVolumeSpecName: "kube-api-access-4b44w") pod "718d214a-065b-4b5a-8d75-db093428a9bf" (UID: "718d214a-065b-4b5a-8d75-db093428a9bf"). InnerVolumeSpecName "kube-api-access-4b44w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.022889 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.022939 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b44w\" (UniqueName: \"kubernetes.io/projected/718d214a-065b-4b5a-8d75-db093428a9bf-kube-api-access-4b44w\") on node \"crc\" DevicePath \"\"" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.342380 4901 generic.go:334] "Generic (PLEG): container finished" podID="718d214a-065b-4b5a-8d75-db093428a9bf" containerID="82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976" exitCode=0 Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.342445 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqkhd" event={"ID":"718d214a-065b-4b5a-8d75-db093428a9bf","Type":"ContainerDied","Data":"82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976"} Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.342486 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqkhd" event={"ID":"718d214a-065b-4b5a-8d75-db093428a9bf","Type":"ContainerDied","Data":"fc724dd4971210943a7411b5b5df2df250e3d44d066ae12a63d831817f85ce3d"} Mar 09 03:22:45 crc kubenswrapper[4901]: 
I0309 03:22:45.342697 4901 scope.go:117] "RemoveContainer" containerID="82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.342808 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tqkhd" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.375678 4901 scope.go:117] "RemoveContainer" containerID="06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.403622 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "718d214a-065b-4b5a-8d75-db093428a9bf" (UID: "718d214a-065b-4b5a-8d75-db093428a9bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.407265 4901 scope.go:117] "RemoveContainer" containerID="70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.430618 4901 scope.go:117] "RemoveContainer" containerID="82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976" Mar 09 03:22:45 crc kubenswrapper[4901]: E0309 03:22:45.431051 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976\": container with ID starting with 82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976 not found: ID does not exist" containerID="82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.431086 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976"} err="failed to get container status \"82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976\": rpc error: code = NotFound desc = could not find container \"82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976\": container with ID starting with 82676d2a5e03bef57ee36f721f07c4065262d6c798f2b95cea110277c1e4f976 not found: ID does not exist" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.431108 4901 scope.go:117] "RemoveContainer" containerID="06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.431332 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/718d214a-065b-4b5a-8d75-db093428a9bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:22:45 crc kubenswrapper[4901]: E0309 03:22:45.431451 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24\": container with ID starting with 06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24 not found: ID does not exist" containerID="06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.431501 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24"} err="failed to get container status \"06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24\": rpc error: code = NotFound desc = could not find container \"06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24\": container with ID starting with 06b763754d7a7e52f9fdf699534044753932836b182fd8e03bbce896c1d7dd24 not found: ID does not exist" Mar 09 03:22:45 crc 
kubenswrapper[4901]: I0309 03:22:45.431531 4901 scope.go:117] "RemoveContainer" containerID="70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca" Mar 09 03:22:45 crc kubenswrapper[4901]: E0309 03:22:45.432160 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca\": container with ID starting with 70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca not found: ID does not exist" containerID="70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.432193 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca"} err="failed to get container status \"70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca\": rpc error: code = NotFound desc = could not find container \"70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca\": container with ID starting with 70751f429dd0f21a6fd68fa0e4b1ced91b783d78c090cebf30672712881682ca not found: ID does not exist" Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.694800 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tqkhd"] Mar 09 03:22:45 crc kubenswrapper[4901]: I0309 03:22:45.699936 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tqkhd"] Mar 09 03:22:46 crc kubenswrapper[4901]: I0309 03:22:46.122016 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="718d214a-065b-4b5a-8d75-db093428a9bf" path="/var/lib/kubelet/pods/718d214a-065b-4b5a-8d75-db093428a9bf/volumes" Mar 09 03:23:00 crc kubenswrapper[4901]: I0309 03:23:00.863531 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:23:00 crc kubenswrapper[4901]: I0309 03:23:00.864142 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:23:00 crc kubenswrapper[4901]: I0309 03:23:00.864262 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 03:23:00 crc kubenswrapper[4901]: I0309 03:23:00.865102 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 03:23:00 crc kubenswrapper[4901]: I0309 03:23:00.865189 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" gracePeriod=600 Mar 09 03:23:01 crc kubenswrapper[4901]: E0309 03:23:01.009852 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:23:01 crc kubenswrapper[4901]: I0309 03:23:01.512198 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" exitCode=0 Mar 09 03:23:01 crc kubenswrapper[4901]: I0309 03:23:01.512285 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e"} Mar 09 03:23:01 crc kubenswrapper[4901]: I0309 03:23:01.512341 4901 scope.go:117] "RemoveContainer" containerID="fe198574e3e8e3b772eb309166e1823040e85757d5648d18c8bedb1946e05ad9" Mar 09 03:23:01 crc kubenswrapper[4901]: I0309 03:23:01.512982 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:23:01 crc kubenswrapper[4901]: E0309 03:23:01.513549 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:23:15 crc kubenswrapper[4901]: I0309 03:23:15.106588 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:23:15 crc kubenswrapper[4901]: E0309 03:23:15.107730 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:23:27 crc kubenswrapper[4901]: I0309 03:23:27.107134 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:23:27 crc kubenswrapper[4901]: E0309 03:23:27.108258 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:23:41 crc kubenswrapper[4901]: I0309 03:23:41.107004 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:23:41 crc kubenswrapper[4901]: E0309 03:23:41.107946 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:23:55 crc kubenswrapper[4901]: I0309 03:23:55.106955 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:23:55 crc kubenswrapper[4901]: E0309 03:23:55.108007 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.156870 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550444-mqxvm"] Mar 09 03:24:00 crc kubenswrapper[4901]: E0309 03:24:00.157596 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718d214a-065b-4b5a-8d75-db093428a9bf" containerName="registry-server" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.157616 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="718d214a-065b-4b5a-8d75-db093428a9bf" containerName="registry-server" Mar 09 03:24:00 crc kubenswrapper[4901]: E0309 03:24:00.157684 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718d214a-065b-4b5a-8d75-db093428a9bf" containerName="extract-utilities" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.157698 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="718d214a-065b-4b5a-8d75-db093428a9bf" containerName="extract-utilities" Mar 09 03:24:00 crc kubenswrapper[4901]: E0309 03:24:00.157720 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718d214a-065b-4b5a-8d75-db093428a9bf" containerName="extract-content" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.157733 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="718d214a-065b-4b5a-8d75-db093428a9bf" containerName="extract-content" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.157963 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="718d214a-065b-4b5a-8d75-db093428a9bf" containerName="registry-server" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.158679 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550444-mqxvm" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.163535 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.163929 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.164192 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.180168 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550444-mqxvm"] Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.322295 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw2xj\" (UniqueName: \"kubernetes.io/projected/88f4b783-1a16-4ead-bbe7-b5b982392e97-kube-api-access-vw2xj\") pod \"auto-csr-approver-29550444-mqxvm\" (UID: \"88f4b783-1a16-4ead-bbe7-b5b982392e97\") " pod="openshift-infra/auto-csr-approver-29550444-mqxvm" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.424335 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw2xj\" (UniqueName: \"kubernetes.io/projected/88f4b783-1a16-4ead-bbe7-b5b982392e97-kube-api-access-vw2xj\") pod \"auto-csr-approver-29550444-mqxvm\" (UID: \"88f4b783-1a16-4ead-bbe7-b5b982392e97\") " pod="openshift-infra/auto-csr-approver-29550444-mqxvm" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.463212 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw2xj\" (UniqueName: \"kubernetes.io/projected/88f4b783-1a16-4ead-bbe7-b5b982392e97-kube-api-access-vw2xj\") pod \"auto-csr-approver-29550444-mqxvm\" (UID: \"88f4b783-1a16-4ead-bbe7-b5b982392e97\") " 
pod="openshift-infra/auto-csr-approver-29550444-mqxvm" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.526566 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550444-mqxvm" Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.867099 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550444-mqxvm"] Mar 09 03:24:00 crc kubenswrapper[4901]: W0309 03:24:00.880552 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88f4b783_1a16_4ead_bbe7_b5b982392e97.slice/crio-d7d4a09279d2ff026109fa5b341bc5ff7bebfe930072a38b99c95e4d1f078004 WatchSource:0}: Error finding container d7d4a09279d2ff026109fa5b341bc5ff7bebfe930072a38b99c95e4d1f078004: Status 404 returned error can't find the container with id d7d4a09279d2ff026109fa5b341bc5ff7bebfe930072a38b99c95e4d1f078004 Mar 09 03:24:00 crc kubenswrapper[4901]: I0309 03:24:00.883108 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 03:24:01 crc kubenswrapper[4901]: I0309 03:24:01.134499 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550444-mqxvm" event={"ID":"88f4b783-1a16-4ead-bbe7-b5b982392e97","Type":"ContainerStarted","Data":"d7d4a09279d2ff026109fa5b341bc5ff7bebfe930072a38b99c95e4d1f078004"} Mar 09 03:24:03 crc kubenswrapper[4901]: I0309 03:24:03.155756 4901 generic.go:334] "Generic (PLEG): container finished" podID="88f4b783-1a16-4ead-bbe7-b5b982392e97" containerID="c857fd9cd591bf517cfbb0a60c1c4bf4d577f39c99853212a851652a4620dc21" exitCode=0 Mar 09 03:24:03 crc kubenswrapper[4901]: I0309 03:24:03.155842 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550444-mqxvm" 
event={"ID":"88f4b783-1a16-4ead-bbe7-b5b982392e97","Type":"ContainerDied","Data":"c857fd9cd591bf517cfbb0a60c1c4bf4d577f39c99853212a851652a4620dc21"} Mar 09 03:24:04 crc kubenswrapper[4901]: I0309 03:24:04.515017 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550444-mqxvm" Mar 09 03:24:04 crc kubenswrapper[4901]: I0309 03:24:04.692581 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw2xj\" (UniqueName: \"kubernetes.io/projected/88f4b783-1a16-4ead-bbe7-b5b982392e97-kube-api-access-vw2xj\") pod \"88f4b783-1a16-4ead-bbe7-b5b982392e97\" (UID: \"88f4b783-1a16-4ead-bbe7-b5b982392e97\") " Mar 09 03:24:04 crc kubenswrapper[4901]: I0309 03:24:04.701575 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f4b783-1a16-4ead-bbe7-b5b982392e97-kube-api-access-vw2xj" (OuterVolumeSpecName: "kube-api-access-vw2xj") pod "88f4b783-1a16-4ead-bbe7-b5b982392e97" (UID: "88f4b783-1a16-4ead-bbe7-b5b982392e97"). InnerVolumeSpecName "kube-api-access-vw2xj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:24:04 crc kubenswrapper[4901]: I0309 03:24:04.794983 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw2xj\" (UniqueName: \"kubernetes.io/projected/88f4b783-1a16-4ead-bbe7-b5b982392e97-kube-api-access-vw2xj\") on node \"crc\" DevicePath \"\"" Mar 09 03:24:05 crc kubenswrapper[4901]: I0309 03:24:05.179788 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550444-mqxvm" event={"ID":"88f4b783-1a16-4ead-bbe7-b5b982392e97","Type":"ContainerDied","Data":"d7d4a09279d2ff026109fa5b341bc5ff7bebfe930072a38b99c95e4d1f078004"} Mar 09 03:24:05 crc kubenswrapper[4901]: I0309 03:24:05.180115 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7d4a09279d2ff026109fa5b341bc5ff7bebfe930072a38b99c95e4d1f078004" Mar 09 03:24:05 crc kubenswrapper[4901]: I0309 03:24:05.179914 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550444-mqxvm" Mar 09 03:24:05 crc kubenswrapper[4901]: I0309 03:24:05.638702 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550438-q2n4q"] Mar 09 03:24:05 crc kubenswrapper[4901]: I0309 03:24:05.652430 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550438-q2n4q"] Mar 09 03:24:06 crc kubenswrapper[4901]: I0309 03:24:06.121933 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c668c14b-279e-4201-9464-37f15f96b512" path="/var/lib/kubelet/pods/c668c14b-279e-4201-9464-37f15f96b512/volumes" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.506537 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-42knq"] Mar 09 03:24:07 crc kubenswrapper[4901]: E0309 03:24:07.507017 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="88f4b783-1a16-4ead-bbe7-b5b982392e97" containerName="oc" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.507041 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f4b783-1a16-4ead-bbe7-b5b982392e97" containerName="oc" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.507379 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f4b783-1a16-4ead-bbe7-b5b982392e97" containerName="oc" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.509265 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.531250 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42knq"] Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.643734 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-utilities\") pod \"community-operators-42knq\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.644333 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j9j9\" (UniqueName: \"kubernetes.io/projected/f7db523e-196e-4c1a-9e5f-572c76f04a79-kube-api-access-9j9j9\") pod \"community-operators-42knq\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.644642 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-catalog-content\") pod \"community-operators-42knq\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " 
pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.698831 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-brllk"] Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.701941 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.708553 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-brllk"] Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.746182 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-utilities\") pod \"community-operators-42knq\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.746278 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j9j9\" (UniqueName: \"kubernetes.io/projected/f7db523e-196e-4c1a-9e5f-572c76f04a79-kube-api-access-9j9j9\") pod \"community-operators-42knq\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.746327 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-catalog-content\") pod \"community-operators-42knq\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.747013 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-utilities\") pod \"community-operators-42knq\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.747056 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-catalog-content\") pod \"community-operators-42knq\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.770026 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j9j9\" (UniqueName: \"kubernetes.io/projected/f7db523e-196e-4c1a-9e5f-572c76f04a79-kube-api-access-9j9j9\") pod \"community-operators-42knq\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.844003 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.847578 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp45z\" (UniqueName: \"kubernetes.io/projected/658e55f8-1839-40c2-9dc9-892a2ea611fe-kube-api-access-fp45z\") pod \"redhat-marketplace-brllk\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.847657 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-catalog-content\") pod \"redhat-marketplace-brllk\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.847686 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-utilities\") pod \"redhat-marketplace-brllk\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.950677 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp45z\" (UniqueName: \"kubernetes.io/projected/658e55f8-1839-40c2-9dc9-892a2ea611fe-kube-api-access-fp45z\") pod \"redhat-marketplace-brllk\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.950769 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-catalog-content\") pod 
\"redhat-marketplace-brllk\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.950815 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-utilities\") pod \"redhat-marketplace-brllk\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.951576 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-utilities\") pod \"redhat-marketplace-brllk\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:07 crc kubenswrapper[4901]: I0309 03:24:07.952252 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-catalog-content\") pod \"redhat-marketplace-brllk\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:08 crc kubenswrapper[4901]: I0309 03:24:08.000021 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp45z\" (UniqueName: \"kubernetes.io/projected/658e55f8-1839-40c2-9dc9-892a2ea611fe-kube-api-access-fp45z\") pod \"redhat-marketplace-brllk\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:08 crc kubenswrapper[4901]: I0309 03:24:08.025183 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:08 crc kubenswrapper[4901]: I0309 03:24:08.113565 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:24:08 crc kubenswrapper[4901]: E0309 03:24:08.114030 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:24:08 crc kubenswrapper[4901]: I0309 03:24:08.377556 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42knq"] Mar 09 03:24:08 crc kubenswrapper[4901]: I0309 03:24:08.515699 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-brllk"] Mar 09 03:24:08 crc kubenswrapper[4901]: W0309 03:24:08.524320 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658e55f8_1839_40c2_9dc9_892a2ea611fe.slice/crio-fc6a5ac79117894e0fc984149472198778c7a853fc090f65c17182503266886d WatchSource:0}: Error finding container fc6a5ac79117894e0fc984149472198778c7a853fc090f65c17182503266886d: Status 404 returned error can't find the container with id fc6a5ac79117894e0fc984149472198778c7a853fc090f65c17182503266886d Mar 09 03:24:09 crc kubenswrapper[4901]: I0309 03:24:09.220587 4901 generic.go:334] "Generic (PLEG): container finished" podID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerID="301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251" exitCode=0 Mar 09 03:24:09 crc kubenswrapper[4901]: I0309 03:24:09.220666 4901 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-42knq" event={"ID":"f7db523e-196e-4c1a-9e5f-572c76f04a79","Type":"ContainerDied","Data":"301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251"} Mar 09 03:24:09 crc kubenswrapper[4901]: I0309 03:24:09.221132 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42knq" event={"ID":"f7db523e-196e-4c1a-9e5f-572c76f04a79","Type":"ContainerStarted","Data":"3952c492ad2f765b4640c6e33bbf824b4edf7018b5f12929d208405e2cc5bd23"} Mar 09 03:24:09 crc kubenswrapper[4901]: I0309 03:24:09.224431 4901 generic.go:334] "Generic (PLEG): container finished" podID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerID="dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4" exitCode=0 Mar 09 03:24:09 crc kubenswrapper[4901]: I0309 03:24:09.224503 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brllk" event={"ID":"658e55f8-1839-40c2-9dc9-892a2ea611fe","Type":"ContainerDied","Data":"dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4"} Mar 09 03:24:09 crc kubenswrapper[4901]: I0309 03:24:09.224543 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brllk" event={"ID":"658e55f8-1839-40c2-9dc9-892a2ea611fe","Type":"ContainerStarted","Data":"fc6a5ac79117894e0fc984149472198778c7a853fc090f65c17182503266886d"} Mar 09 03:24:10 crc kubenswrapper[4901]: I0309 03:24:10.235041 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brllk" event={"ID":"658e55f8-1839-40c2-9dc9-892a2ea611fe","Type":"ContainerStarted","Data":"1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5"} Mar 09 03:24:10 crc kubenswrapper[4901]: I0309 03:24:10.240422 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42knq" 
event={"ID":"f7db523e-196e-4c1a-9e5f-572c76f04a79","Type":"ContainerStarted","Data":"f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f"} Mar 09 03:24:11 crc kubenswrapper[4901]: I0309 03:24:11.256592 4901 generic.go:334] "Generic (PLEG): container finished" podID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerID="f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f" exitCode=0 Mar 09 03:24:11 crc kubenswrapper[4901]: I0309 03:24:11.256699 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42knq" event={"ID":"f7db523e-196e-4c1a-9e5f-572c76f04a79","Type":"ContainerDied","Data":"f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f"} Mar 09 03:24:11 crc kubenswrapper[4901]: I0309 03:24:11.263530 4901 generic.go:334] "Generic (PLEG): container finished" podID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerID="1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5" exitCode=0 Mar 09 03:24:11 crc kubenswrapper[4901]: I0309 03:24:11.263585 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brllk" event={"ID":"658e55f8-1839-40c2-9dc9-892a2ea611fe","Type":"ContainerDied","Data":"1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5"} Mar 09 03:24:13 crc kubenswrapper[4901]: I0309 03:24:13.349364 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brllk" event={"ID":"658e55f8-1839-40c2-9dc9-892a2ea611fe","Type":"ContainerStarted","Data":"ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3"} Mar 09 03:24:13 crc kubenswrapper[4901]: I0309 03:24:13.353383 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42knq" event={"ID":"f7db523e-196e-4c1a-9e5f-572c76f04a79","Type":"ContainerStarted","Data":"ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796"} Mar 09 03:24:13 crc kubenswrapper[4901]: I0309 
03:24:13.374639 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-brllk" podStartSLOduration=3.241086318 podStartE2EDuration="6.374618088s" podCreationTimestamp="2026-03-09 03:24:07 +0000 UTC" firstStartedPulling="2026-03-09 03:24:09.228572062 +0000 UTC m=+2573.818235824" lastFinishedPulling="2026-03-09 03:24:12.362103862 +0000 UTC m=+2576.951767594" observedRunningTime="2026-03-09 03:24:13.372001617 +0000 UTC m=+2577.961665339" watchObservedRunningTime="2026-03-09 03:24:13.374618088 +0000 UTC m=+2577.964281860" Mar 09 03:24:13 crc kubenswrapper[4901]: I0309 03:24:13.406640 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-42knq" podStartSLOduration=3.942058908 podStartE2EDuration="6.406612693s" podCreationTimestamp="2026-03-09 03:24:07 +0000 UTC" firstStartedPulling="2026-03-09 03:24:09.223636846 +0000 UTC m=+2573.813300618" lastFinishedPulling="2026-03-09 03:24:11.688190631 +0000 UTC m=+2576.277854403" observedRunningTime="2026-03-09 03:24:13.400184041 +0000 UTC m=+2577.989847773" watchObservedRunningTime="2026-03-09 03:24:13.406612693 +0000 UTC m=+2577.996276425" Mar 09 03:24:17 crc kubenswrapper[4901]: I0309 03:24:17.844343 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:17 crc kubenswrapper[4901]: I0309 03:24:17.845186 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:17 crc kubenswrapper[4901]: I0309 03:24:17.920973 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:18 crc kubenswrapper[4901]: I0309 03:24:18.026408 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:18 
crc kubenswrapper[4901]: I0309 03:24:18.026608 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:18 crc kubenswrapper[4901]: I0309 03:24:18.103994 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:18 crc kubenswrapper[4901]: I0309 03:24:18.458837 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:18 crc kubenswrapper[4901]: I0309 03:24:18.475406 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:19 crc kubenswrapper[4901]: I0309 03:24:19.886523 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42knq"] Mar 09 03:24:20 crc kubenswrapper[4901]: I0309 03:24:20.106546 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:24:20 crc kubenswrapper[4901]: E0309 03:24:20.106912 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:24:20 crc kubenswrapper[4901]: I0309 03:24:20.415159 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-42knq" podUID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerName="registry-server" containerID="cri-o://ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796" gracePeriod=2 Mar 09 03:24:20 crc kubenswrapper[4901]: 
I0309 03:24:20.884856 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-brllk"] Mar 09 03:24:20 crc kubenswrapper[4901]: I0309 03:24:20.889981 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.060667 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j9j9\" (UniqueName: \"kubernetes.io/projected/f7db523e-196e-4c1a-9e5f-572c76f04a79-kube-api-access-9j9j9\") pod \"f7db523e-196e-4c1a-9e5f-572c76f04a79\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.060753 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-catalog-content\") pod \"f7db523e-196e-4c1a-9e5f-572c76f04a79\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.060816 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-utilities\") pod \"f7db523e-196e-4c1a-9e5f-572c76f04a79\" (UID: \"f7db523e-196e-4c1a-9e5f-572c76f04a79\") " Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.062550 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-utilities" (OuterVolumeSpecName: "utilities") pod "f7db523e-196e-4c1a-9e5f-572c76f04a79" (UID: "f7db523e-196e-4c1a-9e5f-572c76f04a79"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.068978 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7db523e-196e-4c1a-9e5f-572c76f04a79-kube-api-access-9j9j9" (OuterVolumeSpecName: "kube-api-access-9j9j9") pod "f7db523e-196e-4c1a-9e5f-572c76f04a79" (UID: "f7db523e-196e-4c1a-9e5f-572c76f04a79"). InnerVolumeSpecName "kube-api-access-9j9j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.162291 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.162338 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j9j9\" (UniqueName: \"kubernetes.io/projected/f7db523e-196e-4c1a-9e5f-572c76f04a79-kube-api-access-9j9j9\") on node \"crc\" DevicePath \"\"" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.227972 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7db523e-196e-4c1a-9e5f-572c76f04a79" (UID: "f7db523e-196e-4c1a-9e5f-572c76f04a79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.263760 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7db523e-196e-4c1a-9e5f-572c76f04a79-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.424887 4901 generic.go:334] "Generic (PLEG): container finished" podID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerID="ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796" exitCode=0 Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.425001 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42knq" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.425018 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42knq" event={"ID":"f7db523e-196e-4c1a-9e5f-572c76f04a79","Type":"ContainerDied","Data":"ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796"} Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.425081 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42knq" event={"ID":"f7db523e-196e-4c1a-9e5f-572c76f04a79","Type":"ContainerDied","Data":"3952c492ad2f765b4640c6e33bbf824b4edf7018b5f12929d208405e2cc5bd23"} Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.425117 4901 scope.go:117] "RemoveContainer" containerID="ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.425139 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-brllk" podUID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerName="registry-server" containerID="cri-o://ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3" gracePeriod=2 Mar 09 03:24:21 crc 
kubenswrapper[4901]: I0309 03:24:21.456788 4901 scope.go:117] "RemoveContainer" containerID="f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.486798 4901 scope.go:117] "RemoveContainer" containerID="301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.503944 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42knq"] Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.512702 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-42knq"] Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.616906 4901 scope.go:117] "RemoveContainer" containerID="ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796" Mar 09 03:24:21 crc kubenswrapper[4901]: E0309 03:24:21.617469 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796\": container with ID starting with ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796 not found: ID does not exist" containerID="ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.617506 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796"} err="failed to get container status \"ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796\": rpc error: code = NotFound desc = could not find container \"ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796\": container with ID starting with ce4c9f78f86e925b85dab939bcdbc4ec0671f53976dba13ba26aacc49da55796 not found: ID does not exist" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.617531 4901 
scope.go:117] "RemoveContainer" containerID="f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f" Mar 09 03:24:21 crc kubenswrapper[4901]: E0309 03:24:21.617846 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f\": container with ID starting with f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f not found: ID does not exist" containerID="f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.617889 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f"} err="failed to get container status \"f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f\": rpc error: code = NotFound desc = could not find container \"f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f\": container with ID starting with f73d8c810a875e420e2260c9852d68a532abc2737b0b87ed777182031ad0990f not found: ID does not exist" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.617933 4901 scope.go:117] "RemoveContainer" containerID="301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251" Mar 09 03:24:21 crc kubenswrapper[4901]: E0309 03:24:21.618184 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251\": container with ID starting with 301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251 not found: ID does not exist" containerID="301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.618210 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251"} err="failed to get container status \"301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251\": rpc error: code = NotFound desc = could not find container \"301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251\": container with ID starting with 301bcc33099b961b14effa5c9c1694b9dbac5e10b8634867a2fe8a854e439251 not found: ID does not exist" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.787101 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.973922 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-catalog-content\") pod \"658e55f8-1839-40c2-9dc9-892a2ea611fe\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.974112 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp45z\" (UniqueName: \"kubernetes.io/projected/658e55f8-1839-40c2-9dc9-892a2ea611fe-kube-api-access-fp45z\") pod \"658e55f8-1839-40c2-9dc9-892a2ea611fe\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.974173 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-utilities\") pod \"658e55f8-1839-40c2-9dc9-892a2ea611fe\" (UID: \"658e55f8-1839-40c2-9dc9-892a2ea611fe\") " Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.975174 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-utilities" (OuterVolumeSpecName: "utilities") pod 
"658e55f8-1839-40c2-9dc9-892a2ea611fe" (UID: "658e55f8-1839-40c2-9dc9-892a2ea611fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:24:21 crc kubenswrapper[4901]: I0309 03:24:21.978314 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658e55f8-1839-40c2-9dc9-892a2ea611fe-kube-api-access-fp45z" (OuterVolumeSpecName: "kube-api-access-fp45z") pod "658e55f8-1839-40c2-9dc9-892a2ea611fe" (UID: "658e55f8-1839-40c2-9dc9-892a2ea611fe"). InnerVolumeSpecName "kube-api-access-fp45z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.009784 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "658e55f8-1839-40c2-9dc9-892a2ea611fe" (UID: "658e55f8-1839-40c2-9dc9-892a2ea611fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.076350 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.076686 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp45z\" (UniqueName: \"kubernetes.io/projected/658e55f8-1839-40c2-9dc9-892a2ea611fe-kube-api-access-fp45z\") on node \"crc\" DevicePath \"\"" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.076917 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/658e55f8-1839-40c2-9dc9-892a2ea611fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.121090 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7db523e-196e-4c1a-9e5f-572c76f04a79" path="/var/lib/kubelet/pods/f7db523e-196e-4c1a-9e5f-572c76f04a79/volumes" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.436049 4901 generic.go:334] "Generic (PLEG): container finished" podID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerID="ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3" exitCode=0 Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.436157 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brllk" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.436378 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brllk" event={"ID":"658e55f8-1839-40c2-9dc9-892a2ea611fe","Type":"ContainerDied","Data":"ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3"} Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.436466 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brllk" event={"ID":"658e55f8-1839-40c2-9dc9-892a2ea611fe","Type":"ContainerDied","Data":"fc6a5ac79117894e0fc984149472198778c7a853fc090f65c17182503266886d"} Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.436504 4901 scope.go:117] "RemoveContainer" containerID="ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.460847 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-brllk"] Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.467580 4901 scope.go:117] "RemoveContainer" containerID="1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.472754 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-brllk"] Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.493578 4901 scope.go:117] "RemoveContainer" containerID="dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.518912 4901 scope.go:117] "RemoveContainer" containerID="ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3" Mar 09 03:24:22 crc kubenswrapper[4901]: E0309 03:24:22.519629 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3\": container with ID starting with ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3 not found: ID does not exist" containerID="ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.519710 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3"} err="failed to get container status \"ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3\": rpc error: code = NotFound desc = could not find container \"ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3\": container with ID starting with ed5dbe6fd8691f1cd9e072a71b352b8e79f96f1eb5ae0be2c2e4c4002ebd01c3 not found: ID does not exist" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.519767 4901 scope.go:117] "RemoveContainer" containerID="1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5" Mar 09 03:24:22 crc kubenswrapper[4901]: E0309 03:24:22.520555 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5\": container with ID starting with 1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5 not found: ID does not exist" containerID="1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.520632 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5"} err="failed to get container status \"1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5\": rpc error: code = NotFound desc = could not find container \"1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5\": container with ID 
starting with 1398b92d52295a01222398d5352f65b871989c67a4991da6cd1847212c1262a5 not found: ID does not exist" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.520678 4901 scope.go:117] "RemoveContainer" containerID="dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4" Mar 09 03:24:22 crc kubenswrapper[4901]: E0309 03:24:22.521247 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4\": container with ID starting with dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4 not found: ID does not exist" containerID="dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4" Mar 09 03:24:22 crc kubenswrapper[4901]: I0309 03:24:22.521285 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4"} err="failed to get container status \"dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4\": rpc error: code = NotFound desc = could not find container \"dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4\": container with ID starting with dd60158cf6e08da2a416e62f788bddc83533e69de968fd0efb7f6cca115b3ee4 not found: ID does not exist" Mar 09 03:24:24 crc kubenswrapper[4901]: I0309 03:24:24.125666 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658e55f8-1839-40c2-9dc9-892a2ea611fe" path="/var/lib/kubelet/pods/658e55f8-1839-40c2-9dc9-892a2ea611fe/volumes" Mar 09 03:24:35 crc kubenswrapper[4901]: I0309 03:24:35.107096 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:24:35 crc kubenswrapper[4901]: E0309 03:24:35.108273 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:24:38 crc kubenswrapper[4901]: I0309 03:24:38.687113 4901 scope.go:117] "RemoveContainer" containerID="4fc8b3ea38c5140c7d3555a226e0ec78dd84ee145d0d16a912adc8e6220510ed" Mar 09 03:24:46 crc kubenswrapper[4901]: I0309 03:24:46.109777 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:24:46 crc kubenswrapper[4901]: E0309 03:24:46.110396 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:24:57 crc kubenswrapper[4901]: I0309 03:24:57.106127 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:24:57 crc kubenswrapper[4901]: E0309 03:24:57.107295 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:25:11 crc kubenswrapper[4901]: I0309 03:25:11.106899 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:25:11 crc 
kubenswrapper[4901]: E0309 03:25:11.107667 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:25:22 crc kubenswrapper[4901]: I0309 03:25:22.107666 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:25:22 crc kubenswrapper[4901]: E0309 03:25:22.108914 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:25:37 crc kubenswrapper[4901]: I0309 03:25:37.107306 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:25:37 crc kubenswrapper[4901]: E0309 03:25:37.108041 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:25:50 crc kubenswrapper[4901]: I0309 03:25:50.106552 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 
09 03:25:50 crc kubenswrapper[4901]: E0309 03:25:50.107607 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.168313 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550446-x2hd5"] Mar 09 03:26:00 crc kubenswrapper[4901]: E0309 03:26:00.169600 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerName="registry-server" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.169633 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerName="registry-server" Mar 09 03:26:00 crc kubenswrapper[4901]: E0309 03:26:00.169652 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerName="registry-server" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.169668 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerName="registry-server" Mar 09 03:26:00 crc kubenswrapper[4901]: E0309 03:26:00.169718 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerName="extract-utilities" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.169735 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerName="extract-utilities" Mar 09 03:26:00 crc kubenswrapper[4901]: E0309 03:26:00.169767 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerName="extract-content" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.169779 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerName="extract-content" Mar 09 03:26:00 crc kubenswrapper[4901]: E0309 03:26:00.169807 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerName="extract-utilities" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.169822 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerName="extract-utilities" Mar 09 03:26:00 crc kubenswrapper[4901]: E0309 03:26:00.169873 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerName="extract-content" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.169885 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerName="extract-content" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.170127 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7db523e-196e-4c1a-9e5f-572c76f04a79" containerName="registry-server" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.170169 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="658e55f8-1839-40c2-9dc9-892a2ea611fe" containerName="registry-server" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.171048 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550446-x2hd5" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.174988 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.175020 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.175337 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.184197 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550446-x2hd5"] Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.232065 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z45mg\" (UniqueName: \"kubernetes.io/projected/c1acdddf-e1ac-412e-a0a4-e874ce64a2bb-kube-api-access-z45mg\") pod \"auto-csr-approver-29550446-x2hd5\" (UID: \"c1acdddf-e1ac-412e-a0a4-e874ce64a2bb\") " pod="openshift-infra/auto-csr-approver-29550446-x2hd5" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.333897 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z45mg\" (UniqueName: \"kubernetes.io/projected/c1acdddf-e1ac-412e-a0a4-e874ce64a2bb-kube-api-access-z45mg\") pod \"auto-csr-approver-29550446-x2hd5\" (UID: \"c1acdddf-e1ac-412e-a0a4-e874ce64a2bb\") " pod="openshift-infra/auto-csr-approver-29550446-x2hd5" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.370680 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z45mg\" (UniqueName: \"kubernetes.io/projected/c1acdddf-e1ac-412e-a0a4-e874ce64a2bb-kube-api-access-z45mg\") pod \"auto-csr-approver-29550446-x2hd5\" (UID: \"c1acdddf-e1ac-412e-a0a4-e874ce64a2bb\") " 
pod="openshift-infra/auto-csr-approver-29550446-x2hd5" Mar 09 03:26:00 crc kubenswrapper[4901]: I0309 03:26:00.507680 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550446-x2hd5" Mar 09 03:26:01 crc kubenswrapper[4901]: I0309 03:26:01.026568 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550446-x2hd5"] Mar 09 03:26:01 crc kubenswrapper[4901]: I0309 03:26:01.350785 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550446-x2hd5" event={"ID":"c1acdddf-e1ac-412e-a0a4-e874ce64a2bb","Type":"ContainerStarted","Data":"4b34d95524c72380d23ccf949522ad0099c5d950c97b6926ba9df76fff892122"} Mar 09 03:26:02 crc kubenswrapper[4901]: I0309 03:26:02.358780 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550446-x2hd5" event={"ID":"c1acdddf-e1ac-412e-a0a4-e874ce64a2bb","Type":"ContainerStarted","Data":"e28a88da3387af3a92e2cdab4ed2a00b617b45512b0375b0589f4c301d83679b"} Mar 09 03:26:02 crc kubenswrapper[4901]: I0309 03:26:02.385840 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550446-x2hd5" podStartSLOduration=1.3574215760000001 podStartE2EDuration="2.385772704s" podCreationTimestamp="2026-03-09 03:26:00 +0000 UTC" firstStartedPulling="2026-03-09 03:26:01.048702633 +0000 UTC m=+2685.638366395" lastFinishedPulling="2026-03-09 03:26:02.077053781 +0000 UTC m=+2686.666717523" observedRunningTime="2026-03-09 03:26:02.372909842 +0000 UTC m=+2686.962573574" watchObservedRunningTime="2026-03-09 03:26:02.385772704 +0000 UTC m=+2686.975436456" Mar 09 03:26:03 crc kubenswrapper[4901]: I0309 03:26:03.373700 4901 generic.go:334] "Generic (PLEG): container finished" podID="c1acdddf-e1ac-412e-a0a4-e874ce64a2bb" containerID="e28a88da3387af3a92e2cdab4ed2a00b617b45512b0375b0589f4c301d83679b" exitCode=0 Mar 09 03:26:03 crc 
kubenswrapper[4901]: I0309 03:26:03.373784 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550446-x2hd5" event={"ID":"c1acdddf-e1ac-412e-a0a4-e874ce64a2bb","Type":"ContainerDied","Data":"e28a88da3387af3a92e2cdab4ed2a00b617b45512b0375b0589f4c301d83679b"} Mar 09 03:26:04 crc kubenswrapper[4901]: I0309 03:26:04.111431 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:26:04 crc kubenswrapper[4901]: E0309 03:26:04.112177 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:26:04 crc kubenswrapper[4901]: I0309 03:26:04.738183 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550446-x2hd5" Mar 09 03:26:04 crc kubenswrapper[4901]: I0309 03:26:04.910289 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z45mg\" (UniqueName: \"kubernetes.io/projected/c1acdddf-e1ac-412e-a0a4-e874ce64a2bb-kube-api-access-z45mg\") pod \"c1acdddf-e1ac-412e-a0a4-e874ce64a2bb\" (UID: \"c1acdddf-e1ac-412e-a0a4-e874ce64a2bb\") " Mar 09 03:26:04 crc kubenswrapper[4901]: I0309 03:26:04.917538 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1acdddf-e1ac-412e-a0a4-e874ce64a2bb-kube-api-access-z45mg" (OuterVolumeSpecName: "kube-api-access-z45mg") pod "c1acdddf-e1ac-412e-a0a4-e874ce64a2bb" (UID: "c1acdddf-e1ac-412e-a0a4-e874ce64a2bb"). InnerVolumeSpecName "kube-api-access-z45mg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:26:05 crc kubenswrapper[4901]: I0309 03:26:05.012447 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z45mg\" (UniqueName: \"kubernetes.io/projected/c1acdddf-e1ac-412e-a0a4-e874ce64a2bb-kube-api-access-z45mg\") on node \"crc\" DevicePath \"\"" Mar 09 03:26:05 crc kubenswrapper[4901]: I0309 03:26:05.392667 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550446-x2hd5" event={"ID":"c1acdddf-e1ac-412e-a0a4-e874ce64a2bb","Type":"ContainerDied","Data":"4b34d95524c72380d23ccf949522ad0099c5d950c97b6926ba9df76fff892122"} Mar 09 03:26:05 crc kubenswrapper[4901]: I0309 03:26:05.392722 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b34d95524c72380d23ccf949522ad0099c5d950c97b6926ba9df76fff892122" Mar 09 03:26:05 crc kubenswrapper[4901]: I0309 03:26:05.392761 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550446-x2hd5" Mar 09 03:26:05 crc kubenswrapper[4901]: I0309 03:26:05.457441 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550440-jzzt2"] Mar 09 03:26:05 crc kubenswrapper[4901]: I0309 03:26:05.464671 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550440-jzzt2"] Mar 09 03:26:06 crc kubenswrapper[4901]: I0309 03:26:06.138218 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6aa863-04d9-44c1-95ba-22f343e6e914" path="/var/lib/kubelet/pods/0c6aa863-04d9-44c1-95ba-22f343e6e914/volumes" Mar 09 03:26:19 crc kubenswrapper[4901]: I0309 03:26:19.107066 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:26:19 crc kubenswrapper[4901]: E0309 03:26:19.108301 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:26:31 crc kubenswrapper[4901]: I0309 03:26:31.106793 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:26:31 crc kubenswrapper[4901]: E0309 03:26:31.107848 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:26:38 crc kubenswrapper[4901]: I0309 03:26:38.812187 4901 scope.go:117] "RemoveContainer" containerID="1ef45a2d2de790508e49589be6162b1bd92b465543f0df92fbf8b828ab064158" Mar 09 03:26:45 crc kubenswrapper[4901]: I0309 03:26:45.105930 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:26:45 crc kubenswrapper[4901]: E0309 03:26:45.107033 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:26:58 crc kubenswrapper[4901]: I0309 03:26:58.107154 4901 scope.go:117] "RemoveContainer" 
containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:26:58 crc kubenswrapper[4901]: E0309 03:26:58.108190 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:27:10 crc kubenswrapper[4901]: I0309 03:27:10.106611 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:27:10 crc kubenswrapper[4901]: E0309 03:27:10.107577 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:27:25 crc kubenswrapper[4901]: I0309 03:27:25.107872 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:27:25 crc kubenswrapper[4901]: E0309 03:27:25.109021 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:27:40 crc kubenswrapper[4901]: I0309 03:27:40.106372 4901 scope.go:117] 
"RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:27:40 crc kubenswrapper[4901]: E0309 03:27:40.107509 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:27:52 crc kubenswrapper[4901]: I0309 03:27:52.107011 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:27:52 crc kubenswrapper[4901]: E0309 03:27:52.108077 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.163037 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550448-kz9dr"] Mar 09 03:28:00 crc kubenswrapper[4901]: E0309 03:28:00.164080 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1acdddf-e1ac-412e-a0a4-e874ce64a2bb" containerName="oc" Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.164101 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1acdddf-e1ac-412e-a0a4-e874ce64a2bb" containerName="oc" Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.164320 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1acdddf-e1ac-412e-a0a4-e874ce64a2bb" containerName="oc" Mar 
09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.164996 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550448-kz9dr" Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.167711 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.169618 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.170767 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.188432 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550448-kz9dr"] Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.269530 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqvhd\" (UniqueName: \"kubernetes.io/projected/d239abea-17e6-4c02-8897-a079ef1d0dc3-kube-api-access-pqvhd\") pod \"auto-csr-approver-29550448-kz9dr\" (UID: \"d239abea-17e6-4c02-8897-a079ef1d0dc3\") " pod="openshift-infra/auto-csr-approver-29550448-kz9dr" Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.371760 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqvhd\" (UniqueName: \"kubernetes.io/projected/d239abea-17e6-4c02-8897-a079ef1d0dc3-kube-api-access-pqvhd\") pod \"auto-csr-approver-29550448-kz9dr\" (UID: \"d239abea-17e6-4c02-8897-a079ef1d0dc3\") " pod="openshift-infra/auto-csr-approver-29550448-kz9dr" Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.389421 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqvhd\" (UniqueName: 
\"kubernetes.io/projected/d239abea-17e6-4c02-8897-a079ef1d0dc3-kube-api-access-pqvhd\") pod \"auto-csr-approver-29550448-kz9dr\" (UID: \"d239abea-17e6-4c02-8897-a079ef1d0dc3\") " pod="openshift-infra/auto-csr-approver-29550448-kz9dr" Mar 09 03:28:00 crc kubenswrapper[4901]: I0309 03:28:00.487078 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550448-kz9dr" Mar 09 03:28:01 crc kubenswrapper[4901]: I0309 03:28:01.010521 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550448-kz9dr"] Mar 09 03:28:01 crc kubenswrapper[4901]: I0309 03:28:01.550803 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550448-kz9dr" event={"ID":"d239abea-17e6-4c02-8897-a079ef1d0dc3","Type":"ContainerStarted","Data":"d1d50bb5e792512df6443d63c11eb36c05d841a7dd59dff370dfdedf3f03bdb1"} Mar 09 03:28:02 crc kubenswrapper[4901]: I0309 03:28:02.562174 4901 generic.go:334] "Generic (PLEG): container finished" podID="d239abea-17e6-4c02-8897-a079ef1d0dc3" containerID="aa5e32d1be8d61aeab028ce2be3ca80e3a83104e072267ce83f1464993cc68ba" exitCode=0 Mar 09 03:28:02 crc kubenswrapper[4901]: I0309 03:28:02.562303 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550448-kz9dr" event={"ID":"d239abea-17e6-4c02-8897-a079ef1d0dc3","Type":"ContainerDied","Data":"aa5e32d1be8d61aeab028ce2be3ca80e3a83104e072267ce83f1464993cc68ba"} Mar 09 03:28:04 crc kubenswrapper[4901]: I0309 03:28:04.002993 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550448-kz9dr" Mar 09 03:28:04 crc kubenswrapper[4901]: I0309 03:28:04.134980 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqvhd\" (UniqueName: \"kubernetes.io/projected/d239abea-17e6-4c02-8897-a079ef1d0dc3-kube-api-access-pqvhd\") pod \"d239abea-17e6-4c02-8897-a079ef1d0dc3\" (UID: \"d239abea-17e6-4c02-8897-a079ef1d0dc3\") " Mar 09 03:28:04 crc kubenswrapper[4901]: I0309 03:28:04.145438 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d239abea-17e6-4c02-8897-a079ef1d0dc3-kube-api-access-pqvhd" (OuterVolumeSpecName: "kube-api-access-pqvhd") pod "d239abea-17e6-4c02-8897-a079ef1d0dc3" (UID: "d239abea-17e6-4c02-8897-a079ef1d0dc3"). InnerVolumeSpecName "kube-api-access-pqvhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:28:04 crc kubenswrapper[4901]: I0309 03:28:04.237671 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqvhd\" (UniqueName: \"kubernetes.io/projected/d239abea-17e6-4c02-8897-a079ef1d0dc3-kube-api-access-pqvhd\") on node \"crc\" DevicePath \"\"" Mar 09 03:28:04 crc kubenswrapper[4901]: I0309 03:28:04.585822 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550448-kz9dr" event={"ID":"d239abea-17e6-4c02-8897-a079ef1d0dc3","Type":"ContainerDied","Data":"d1d50bb5e792512df6443d63c11eb36c05d841a7dd59dff370dfdedf3f03bdb1"} Mar 09 03:28:04 crc kubenswrapper[4901]: I0309 03:28:04.586394 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1d50bb5e792512df6443d63c11eb36c05d841a7dd59dff370dfdedf3f03bdb1" Mar 09 03:28:04 crc kubenswrapper[4901]: I0309 03:28:04.586485 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550448-kz9dr" Mar 09 03:28:05 crc kubenswrapper[4901]: I0309 03:28:05.099129 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550442-79j67"] Mar 09 03:28:05 crc kubenswrapper[4901]: I0309 03:28:05.109630 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550442-79j67"] Mar 09 03:28:06 crc kubenswrapper[4901]: I0309 03:28:06.121668 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6db7d3-b776-43ea-911a-411904a0deb4" path="/var/lib/kubelet/pods/ff6db7d3-b776-43ea-911a-411904a0deb4/volumes" Mar 09 03:28:07 crc kubenswrapper[4901]: I0309 03:28:07.106527 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:28:07 crc kubenswrapper[4901]: I0309 03:28:07.614290 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"96795b44fdfb1d02477775a37c4cf63b6931ee38b13fb7a897de264e1203c8a7"} Mar 09 03:28:38 crc kubenswrapper[4901]: I0309 03:28:38.922745 4901 scope.go:117] "RemoveContainer" containerID="fdce3477370416b31f6ddef6a9fd3137859cf002a6ac5f19d90b45842088dfdc" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.164093 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550450-tj9sv"] Mar 09 03:30:00 crc kubenswrapper[4901]: E0309 03:30:00.166941 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d239abea-17e6-4c02-8897-a079ef1d0dc3" containerName="oc" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.167086 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d239abea-17e6-4c02-8897-a079ef1d0dc3" containerName="oc" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.167445 4901 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d239abea-17e6-4c02-8897-a079ef1d0dc3" containerName="oc" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.168410 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550450-tj9sv" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.170840 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz"] Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.171798 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.171815 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.171922 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.172848 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.173706 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.174721 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.186746 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550450-tj9sv"] Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.230779 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz"] Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.343389 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38e0d506-1a40-4a28-8819-ebae5d085f89-config-volume\") pod \"collect-profiles-29550450-h5mcz\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.343520 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld5xl\" (UniqueName: \"kubernetes.io/projected/3fba0d69-8ec5-4a56-8486-db533ab566ec-kube-api-access-ld5xl\") pod \"auto-csr-approver-29550450-tj9sv\" (UID: \"3fba0d69-8ec5-4a56-8486-db533ab566ec\") " pod="openshift-infra/auto-csr-approver-29550450-tj9sv" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.343578 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69m47\" (UniqueName: \"kubernetes.io/projected/38e0d506-1a40-4a28-8819-ebae5d085f89-kube-api-access-69m47\") pod \"collect-profiles-29550450-h5mcz\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.343662 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38e0d506-1a40-4a28-8819-ebae5d085f89-secret-volume\") pod \"collect-profiles-29550450-h5mcz\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.445442 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/38e0d506-1a40-4a28-8819-ebae5d085f89-config-volume\") pod \"collect-profiles-29550450-h5mcz\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.445611 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld5xl\" (UniqueName: \"kubernetes.io/projected/3fba0d69-8ec5-4a56-8486-db533ab566ec-kube-api-access-ld5xl\") pod \"auto-csr-approver-29550450-tj9sv\" (UID: \"3fba0d69-8ec5-4a56-8486-db533ab566ec\") " pod="openshift-infra/auto-csr-approver-29550450-tj9sv" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.445687 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69m47\" (UniqueName: \"kubernetes.io/projected/38e0d506-1a40-4a28-8819-ebae5d085f89-kube-api-access-69m47\") pod \"collect-profiles-29550450-h5mcz\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.445800 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38e0d506-1a40-4a28-8819-ebae5d085f89-secret-volume\") pod \"collect-profiles-29550450-h5mcz\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.446347 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38e0d506-1a40-4a28-8819-ebae5d085f89-config-volume\") pod \"collect-profiles-29550450-h5mcz\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: 
I0309 03:30:00.454033 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38e0d506-1a40-4a28-8819-ebae5d085f89-secret-volume\") pod \"collect-profiles-29550450-h5mcz\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.465954 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld5xl\" (UniqueName: \"kubernetes.io/projected/3fba0d69-8ec5-4a56-8486-db533ab566ec-kube-api-access-ld5xl\") pod \"auto-csr-approver-29550450-tj9sv\" (UID: \"3fba0d69-8ec5-4a56-8486-db533ab566ec\") " pod="openshift-infra/auto-csr-approver-29550450-tj9sv" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.466484 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69m47\" (UniqueName: \"kubernetes.io/projected/38e0d506-1a40-4a28-8819-ebae5d085f89-kube-api-access-69m47\") pod \"collect-profiles-29550450-h5mcz\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.519278 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550450-tj9sv" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.537932 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.838006 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz"] Mar 09 03:30:00 crc kubenswrapper[4901]: I0309 03:30:00.944477 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" event={"ID":"38e0d506-1a40-4a28-8819-ebae5d085f89","Type":"ContainerStarted","Data":"e53820c256adc69a612c188116acecb54bae5a83e64c0a8d47b258679bc604ed"} Mar 09 03:30:01 crc kubenswrapper[4901]: W0309 03:30:01.006591 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fba0d69_8ec5_4a56_8486_db533ab566ec.slice/crio-c918adaafb7ae7704b13925e2a1326d4a0d139158032cb9582fc6a1d66177b95 WatchSource:0}: Error finding container c918adaafb7ae7704b13925e2a1326d4a0d139158032cb9582fc6a1d66177b95: Status 404 returned error can't find the container with id c918adaafb7ae7704b13925e2a1326d4a0d139158032cb9582fc6a1d66177b95 Mar 09 03:30:01 crc kubenswrapper[4901]: I0309 03:30:01.008349 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550450-tj9sv"] Mar 09 03:30:01 crc kubenswrapper[4901]: I0309 03:30:01.010480 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 03:30:01 crc kubenswrapper[4901]: I0309 03:30:01.955054 4901 generic.go:334] "Generic (PLEG): container finished" podID="38e0d506-1a40-4a28-8819-ebae5d085f89" containerID="887521f9d676b728687ca618dbe31f6f346e44011bf697088239931c2d36482a" exitCode=0 Mar 09 03:30:01 crc kubenswrapper[4901]: I0309 03:30:01.955159 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" 
event={"ID":"38e0d506-1a40-4a28-8819-ebae5d085f89","Type":"ContainerDied","Data":"887521f9d676b728687ca618dbe31f6f346e44011bf697088239931c2d36482a"} Mar 09 03:30:01 crc kubenswrapper[4901]: I0309 03:30:01.956807 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550450-tj9sv" event={"ID":"3fba0d69-8ec5-4a56-8486-db533ab566ec","Type":"ContainerStarted","Data":"c918adaafb7ae7704b13925e2a1326d4a0d139158032cb9582fc6a1d66177b95"} Mar 09 03:30:02 crc kubenswrapper[4901]: I0309 03:30:02.971421 4901 generic.go:334] "Generic (PLEG): container finished" podID="3fba0d69-8ec5-4a56-8486-db533ab566ec" containerID="860adf32ce83cf817b6966b87e414ece86915ccb41cbd993e7eb57e154dad9c6" exitCode=0 Mar 09 03:30:02 crc kubenswrapper[4901]: I0309 03:30:02.971469 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550450-tj9sv" event={"ID":"3fba0d69-8ec5-4a56-8486-db533ab566ec","Type":"ContainerDied","Data":"860adf32ce83cf817b6966b87e414ece86915ccb41cbd993e7eb57e154dad9c6"} Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.353416 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.497809 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38e0d506-1a40-4a28-8819-ebae5d085f89-secret-volume\") pod \"38e0d506-1a40-4a28-8819-ebae5d085f89\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.497871 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69m47\" (UniqueName: \"kubernetes.io/projected/38e0d506-1a40-4a28-8819-ebae5d085f89-kube-api-access-69m47\") pod \"38e0d506-1a40-4a28-8819-ebae5d085f89\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.497902 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38e0d506-1a40-4a28-8819-ebae5d085f89-config-volume\") pod \"38e0d506-1a40-4a28-8819-ebae5d085f89\" (UID: \"38e0d506-1a40-4a28-8819-ebae5d085f89\") " Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.499110 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38e0d506-1a40-4a28-8819-ebae5d085f89-config-volume" (OuterVolumeSpecName: "config-volume") pod "38e0d506-1a40-4a28-8819-ebae5d085f89" (UID: "38e0d506-1a40-4a28-8819-ebae5d085f89"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.506089 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e0d506-1a40-4a28-8819-ebae5d085f89-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "38e0d506-1a40-4a28-8819-ebae5d085f89" (UID: "38e0d506-1a40-4a28-8819-ebae5d085f89"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.509069 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38e0d506-1a40-4a28-8819-ebae5d085f89-kube-api-access-69m47" (OuterVolumeSpecName: "kube-api-access-69m47") pod "38e0d506-1a40-4a28-8819-ebae5d085f89" (UID: "38e0d506-1a40-4a28-8819-ebae5d085f89"). InnerVolumeSpecName "kube-api-access-69m47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.600102 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38e0d506-1a40-4a28-8819-ebae5d085f89-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.600158 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69m47\" (UniqueName: \"kubernetes.io/projected/38e0d506-1a40-4a28-8819-ebae5d085f89-kube-api-access-69m47\") on node \"crc\" DevicePath \"\"" Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.600185 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38e0d506-1a40-4a28-8819-ebae5d085f89-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.983848 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" event={"ID":"38e0d506-1a40-4a28-8819-ebae5d085f89","Type":"ContainerDied","Data":"e53820c256adc69a612c188116acecb54bae5a83e64c0a8d47b258679bc604ed"} Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.983918 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e53820c256adc69a612c188116acecb54bae5a83e64c0a8d47b258679bc604ed" Mar 09 03:30:03 crc kubenswrapper[4901]: I0309 03:30:03.983871 4901 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz" Mar 09 03:30:04 crc kubenswrapper[4901]: I0309 03:30:04.290393 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550450-tj9sv" Mar 09 03:30:04 crc kubenswrapper[4901]: I0309 03:30:04.416257 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld5xl\" (UniqueName: \"kubernetes.io/projected/3fba0d69-8ec5-4a56-8486-db533ab566ec-kube-api-access-ld5xl\") pod \"3fba0d69-8ec5-4a56-8486-db533ab566ec\" (UID: \"3fba0d69-8ec5-4a56-8486-db533ab566ec\") " Mar 09 03:30:04 crc kubenswrapper[4901]: I0309 03:30:04.433609 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fba0d69-8ec5-4a56-8486-db533ab566ec-kube-api-access-ld5xl" (OuterVolumeSpecName: "kube-api-access-ld5xl") pod "3fba0d69-8ec5-4a56-8486-db533ab566ec" (UID: "3fba0d69-8ec5-4a56-8486-db533ab566ec"). InnerVolumeSpecName "kube-api-access-ld5xl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:30:04 crc kubenswrapper[4901]: I0309 03:30:04.435761 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5"] Mar 09 03:30:04 crc kubenswrapper[4901]: I0309 03:30:04.442455 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550405-nlxv5"] Mar 09 03:30:04 crc kubenswrapper[4901]: I0309 03:30:04.519074 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld5xl\" (UniqueName: \"kubernetes.io/projected/3fba0d69-8ec5-4a56-8486-db533ab566ec-kube-api-access-ld5xl\") on node \"crc\" DevicePath \"\"" Mar 09 03:30:04 crc kubenswrapper[4901]: I0309 03:30:04.994319 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550450-tj9sv" event={"ID":"3fba0d69-8ec5-4a56-8486-db533ab566ec","Type":"ContainerDied","Data":"c918adaafb7ae7704b13925e2a1326d4a0d139158032cb9582fc6a1d66177b95"} Mar 09 03:30:04 crc kubenswrapper[4901]: I0309 03:30:04.994375 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c918adaafb7ae7704b13925e2a1326d4a0d139158032cb9582fc6a1d66177b95" Mar 09 03:30:04 crc kubenswrapper[4901]: I0309 03:30:04.995451 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550450-tj9sv" Mar 09 03:30:05 crc kubenswrapper[4901]: I0309 03:30:05.356955 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550444-mqxvm"] Mar 09 03:30:05 crc kubenswrapper[4901]: I0309 03:30:05.363562 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550444-mqxvm"] Mar 09 03:30:06 crc kubenswrapper[4901]: I0309 03:30:06.122791 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f4b783-1a16-4ead-bbe7-b5b982392e97" path="/var/lib/kubelet/pods/88f4b783-1a16-4ead-bbe7-b5b982392e97/volumes" Mar 09 03:30:06 crc kubenswrapper[4901]: I0309 03:30:06.124630 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e0edae-f21a-473e-93a7-c9b797f9a112" path="/var/lib/kubelet/pods/e7e0edae-f21a-473e-93a7-c9b797f9a112/volumes" Mar 09 03:30:30 crc kubenswrapper[4901]: I0309 03:30:30.863442 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:30:30 crc kubenswrapper[4901]: I0309 03:30:30.864058 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:30:39 crc kubenswrapper[4901]: I0309 03:30:39.055920 4901 scope.go:117] "RemoveContainer" containerID="c857fd9cd591bf517cfbb0a60c1c4bf4d577f39c99853212a851652a4620dc21" Mar 09 03:30:39 crc kubenswrapper[4901]: I0309 03:30:39.123855 4901 scope.go:117] "RemoveContainer" 
containerID="3ee70dec9345649446b76daa64fb85b6eb014cc1effe10f165caa1e54849e223" Mar 09 03:31:00 crc kubenswrapper[4901]: I0309 03:31:00.862849 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:31:00 crc kubenswrapper[4901]: I0309 03:31:00.863545 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:31:30 crc kubenswrapper[4901]: I0309 03:31:30.863609 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:31:30 crc kubenswrapper[4901]: I0309 03:31:30.864653 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:31:30 crc kubenswrapper[4901]: I0309 03:31:30.864748 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 03:31:30 crc kubenswrapper[4901]: I0309 03:31:30.866329 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"96795b44fdfb1d02477775a37c4cf63b6931ee38b13fb7a897de264e1203c8a7"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 03:31:30 crc kubenswrapper[4901]: I0309 03:31:30.866454 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://96795b44fdfb1d02477775a37c4cf63b6931ee38b13fb7a897de264e1203c8a7" gracePeriod=600 Mar 09 03:31:31 crc kubenswrapper[4901]: I0309 03:31:31.807339 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="96795b44fdfb1d02477775a37c4cf63b6931ee38b13fb7a897de264e1203c8a7" exitCode=0 Mar 09 03:31:31 crc kubenswrapper[4901]: I0309 03:31:31.807423 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"96795b44fdfb1d02477775a37c4cf63b6931ee38b13fb7a897de264e1203c8a7"} Mar 09 03:31:31 crc kubenswrapper[4901]: I0309 03:31:31.807740 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"} Mar 09 03:31:31 crc kubenswrapper[4901]: I0309 03:31:31.807764 4901 scope.go:117] "RemoveContainer" containerID="b882e9e78d8beaa4679398085320aa14420eb8f755a8b3bec066597796f6c72e" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.165426 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550452-4sxfr"] Mar 09 03:32:00 crc kubenswrapper[4901]: E0309 
03:32:00.167824 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e0d506-1a40-4a28-8819-ebae5d085f89" containerName="collect-profiles" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.167851 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e0d506-1a40-4a28-8819-ebae5d085f89" containerName="collect-profiles" Mar 09 03:32:00 crc kubenswrapper[4901]: E0309 03:32:00.167954 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fba0d69-8ec5-4a56-8486-db533ab566ec" containerName="oc" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.168018 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fba0d69-8ec5-4a56-8486-db533ab566ec" containerName="oc" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.168575 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="38e0d506-1a40-4a28-8819-ebae5d085f89" containerName="collect-profiles" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.168666 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fba0d69-8ec5-4a56-8486-db533ab566ec" containerName="oc" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.170808 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550452-4sxfr" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.174633 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.174943 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.175164 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.182046 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550452-4sxfr"] Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.228930 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht2rc\" (UniqueName: \"kubernetes.io/projected/9aa50188-8fc6-4944-8be3-9d8056584741-kube-api-access-ht2rc\") pod \"auto-csr-approver-29550452-4sxfr\" (UID: \"9aa50188-8fc6-4944-8be3-9d8056584741\") " pod="openshift-infra/auto-csr-approver-29550452-4sxfr" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.330839 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht2rc\" (UniqueName: \"kubernetes.io/projected/9aa50188-8fc6-4944-8be3-9d8056584741-kube-api-access-ht2rc\") pod \"auto-csr-approver-29550452-4sxfr\" (UID: \"9aa50188-8fc6-4944-8be3-9d8056584741\") " pod="openshift-infra/auto-csr-approver-29550452-4sxfr" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.369586 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht2rc\" (UniqueName: \"kubernetes.io/projected/9aa50188-8fc6-4944-8be3-9d8056584741-kube-api-access-ht2rc\") pod \"auto-csr-approver-29550452-4sxfr\" (UID: \"9aa50188-8fc6-4944-8be3-9d8056584741\") " 
pod="openshift-infra/auto-csr-approver-29550452-4sxfr" Mar 09 03:32:00 crc kubenswrapper[4901]: I0309 03:32:00.514077 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550452-4sxfr" Mar 09 03:32:01 crc kubenswrapper[4901]: I0309 03:32:01.135524 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550452-4sxfr"] Mar 09 03:32:02 crc kubenswrapper[4901]: I0309 03:32:02.159869 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550452-4sxfr" event={"ID":"9aa50188-8fc6-4944-8be3-9d8056584741","Type":"ContainerStarted","Data":"295237c5dd4533d97f0f43000cef4dc673436ddf97a992145603b0a82bcb488a"} Mar 09 03:32:03 crc kubenswrapper[4901]: I0309 03:32:03.171436 4901 generic.go:334] "Generic (PLEG): container finished" podID="9aa50188-8fc6-4944-8be3-9d8056584741" containerID="1d27618494908526afd1286a15d708f6ad8a48e3f32445bca208dbf02c3d938b" exitCode=0 Mar 09 03:32:03 crc kubenswrapper[4901]: I0309 03:32:03.171536 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550452-4sxfr" event={"ID":"9aa50188-8fc6-4944-8be3-9d8056584741","Type":"ContainerDied","Data":"1d27618494908526afd1286a15d708f6ad8a48e3f32445bca208dbf02c3d938b"} Mar 09 03:32:04 crc kubenswrapper[4901]: I0309 03:32:04.477391 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550452-4sxfr" Mar 09 03:32:04 crc kubenswrapper[4901]: I0309 03:32:04.623073 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht2rc\" (UniqueName: \"kubernetes.io/projected/9aa50188-8fc6-4944-8be3-9d8056584741-kube-api-access-ht2rc\") pod \"9aa50188-8fc6-4944-8be3-9d8056584741\" (UID: \"9aa50188-8fc6-4944-8be3-9d8056584741\") " Mar 09 03:32:04 crc kubenswrapper[4901]: I0309 03:32:04.632309 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa50188-8fc6-4944-8be3-9d8056584741-kube-api-access-ht2rc" (OuterVolumeSpecName: "kube-api-access-ht2rc") pod "9aa50188-8fc6-4944-8be3-9d8056584741" (UID: "9aa50188-8fc6-4944-8be3-9d8056584741"). InnerVolumeSpecName "kube-api-access-ht2rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:32:04 crc kubenswrapper[4901]: I0309 03:32:04.724531 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht2rc\" (UniqueName: \"kubernetes.io/projected/9aa50188-8fc6-4944-8be3-9d8056584741-kube-api-access-ht2rc\") on node \"crc\" DevicePath \"\"" Mar 09 03:32:05 crc kubenswrapper[4901]: I0309 03:32:05.190923 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550452-4sxfr" event={"ID":"9aa50188-8fc6-4944-8be3-9d8056584741","Type":"ContainerDied","Data":"295237c5dd4533d97f0f43000cef4dc673436ddf97a992145603b0a82bcb488a"} Mar 09 03:32:05 crc kubenswrapper[4901]: I0309 03:32:05.190975 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="295237c5dd4533d97f0f43000cef4dc673436ddf97a992145603b0a82bcb488a" Mar 09 03:32:05 crc kubenswrapper[4901]: I0309 03:32:05.191017 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550452-4sxfr" Mar 09 03:32:05 crc kubenswrapper[4901]: I0309 03:32:05.586099 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550446-x2hd5"] Mar 09 03:32:05 crc kubenswrapper[4901]: I0309 03:32:05.592894 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550446-x2hd5"] Mar 09 03:32:06 crc kubenswrapper[4901]: I0309 03:32:06.122156 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1acdddf-e1ac-412e-a0a4-e874ce64a2bb" path="/var/lib/kubelet/pods/c1acdddf-e1ac-412e-a0a4-e874ce64a2bb/volumes" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.412087 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qx77w"] Mar 09 03:32:22 crc kubenswrapper[4901]: E0309 03:32:22.412839 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa50188-8fc6-4944-8be3-9d8056584741" containerName="oc" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.412860 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa50188-8fc6-4944-8be3-9d8056584741" containerName="oc" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.413013 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa50188-8fc6-4944-8be3-9d8056584741" containerName="oc" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.413918 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.427730 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qx77w"] Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.541700 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-utilities\") pod \"redhat-operators-qx77w\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.542081 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-catalog-content\") pod \"redhat-operators-qx77w\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.542237 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj9gn\" (UniqueName: \"kubernetes.io/projected/352e87e5-5aa4-4800-bd7b-36391e72de86-kube-api-access-vj9gn\") pod \"redhat-operators-qx77w\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.643840 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj9gn\" (UniqueName: \"kubernetes.io/projected/352e87e5-5aa4-4800-bd7b-36391e72de86-kube-api-access-vj9gn\") pod \"redhat-operators-qx77w\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.643936 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-utilities\") pod \"redhat-operators-qx77w\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.643995 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-catalog-content\") pod \"redhat-operators-qx77w\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.644510 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-utilities\") pod \"redhat-operators-qx77w\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.644510 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-catalog-content\") pod \"redhat-operators-qx77w\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.665196 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj9gn\" (UniqueName: \"kubernetes.io/projected/352e87e5-5aa4-4800-bd7b-36391e72de86-kube-api-access-vj9gn\") pod \"redhat-operators-qx77w\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:22 crc kubenswrapper[4901]: I0309 03:32:22.757806 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:23 crc kubenswrapper[4901]: I0309 03:32:23.209817 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qx77w"] Mar 09 03:32:23 crc kubenswrapper[4901]: I0309 03:32:23.355435 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx77w" event={"ID":"352e87e5-5aa4-4800-bd7b-36391e72de86","Type":"ContainerStarted","Data":"7be54e5799988cc2f2ca62481a1906dc51ec37b2f59b2bb4385ad2d6fc926e0e"} Mar 09 03:32:24 crc kubenswrapper[4901]: I0309 03:32:24.362459 4901 generic.go:334] "Generic (PLEG): container finished" podID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerID="6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e" exitCode=0 Mar 09 03:32:24 crc kubenswrapper[4901]: I0309 03:32:24.362549 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx77w" event={"ID":"352e87e5-5aa4-4800-bd7b-36391e72de86","Type":"ContainerDied","Data":"6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e"} Mar 09 03:32:26 crc kubenswrapper[4901]: I0309 03:32:26.382292 4901 generic.go:334] "Generic (PLEG): container finished" podID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerID="d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4" exitCode=0 Mar 09 03:32:26 crc kubenswrapper[4901]: I0309 03:32:26.382416 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx77w" event={"ID":"352e87e5-5aa4-4800-bd7b-36391e72de86","Type":"ContainerDied","Data":"d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4"} Mar 09 03:32:27 crc kubenswrapper[4901]: I0309 03:32:27.398630 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx77w" 
event={"ID":"352e87e5-5aa4-4800-bd7b-36391e72de86","Type":"ContainerStarted","Data":"e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6"} Mar 09 03:32:27 crc kubenswrapper[4901]: I0309 03:32:27.427295 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qx77w" podStartSLOduration=2.9896284829999997 podStartE2EDuration="5.427276354s" podCreationTimestamp="2026-03-09 03:32:22 +0000 UTC" firstStartedPulling="2026-03-09 03:32:24.363755341 +0000 UTC m=+3068.953419073" lastFinishedPulling="2026-03-09 03:32:26.801403172 +0000 UTC m=+3071.391066944" observedRunningTime="2026-03-09 03:32:27.421775898 +0000 UTC m=+3072.011439680" watchObservedRunningTime="2026-03-09 03:32:27.427276354 +0000 UTC m=+3072.016940096" Mar 09 03:32:32 crc kubenswrapper[4901]: I0309 03:32:32.758712 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:32 crc kubenswrapper[4901]: I0309 03:32:32.759173 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:33 crc kubenswrapper[4901]: I0309 03:32:33.833764 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qx77w" podUID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerName="registry-server" probeResult="failure" output=< Mar 09 03:32:33 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 03:32:33 crc kubenswrapper[4901]: > Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.629325 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kgpxn"] Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.632811 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.645951 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgpxn"] Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.690689 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt2xl\" (UniqueName: \"kubernetes.io/projected/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-kube-api-access-xt2xl\") pod \"certified-operators-kgpxn\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.690871 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-utilities\") pod \"certified-operators-kgpxn\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.690911 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-catalog-content\") pod \"certified-operators-kgpxn\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.792357 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt2xl\" (UniqueName: \"kubernetes.io/projected/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-kube-api-access-xt2xl\") pod \"certified-operators-kgpxn\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.792471 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-utilities\") pod \"certified-operators-kgpxn\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.792534 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-catalog-content\") pod \"certified-operators-kgpxn\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.793084 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-catalog-content\") pod \"certified-operators-kgpxn\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.793331 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-utilities\") pod \"certified-operators-kgpxn\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.818674 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt2xl\" (UniqueName: \"kubernetes.io/projected/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-kube-api-access-xt2xl\") pod \"certified-operators-kgpxn\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:36 crc kubenswrapper[4901]: I0309 03:32:36.963861 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:37 crc kubenswrapper[4901]: I0309 03:32:37.490790 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgpxn"] Mar 09 03:32:38 crc kubenswrapper[4901]: I0309 03:32:38.492819 4901 generic.go:334] "Generic (PLEG): container finished" podID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerID="1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d" exitCode=0 Mar 09 03:32:38 crc kubenswrapper[4901]: I0309 03:32:38.493070 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgpxn" event={"ID":"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9","Type":"ContainerDied","Data":"1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d"} Mar 09 03:32:38 crc kubenswrapper[4901]: I0309 03:32:38.493198 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgpxn" event={"ID":"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9","Type":"ContainerStarted","Data":"cf1cec8c4948a621bf4aa9981463a41c053456d59cd0de6edc22a1b3f472385f"} Mar 09 03:32:39 crc kubenswrapper[4901]: I0309 03:32:39.271151 4901 scope.go:117] "RemoveContainer" containerID="e28a88da3387af3a92e2cdab4ed2a00b617b45512b0375b0589f4c301d83679b" Mar 09 03:32:39 crc kubenswrapper[4901]: I0309 03:32:39.516597 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgpxn" event={"ID":"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9","Type":"ContainerStarted","Data":"e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4"} Mar 09 03:32:40 crc kubenswrapper[4901]: I0309 03:32:40.530126 4901 generic.go:334] "Generic (PLEG): container finished" podID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerID="e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4" exitCode=0 Mar 09 03:32:40 crc kubenswrapper[4901]: I0309 03:32:40.530204 4901 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgpxn" event={"ID":"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9","Type":"ContainerDied","Data":"e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4"} Mar 09 03:32:41 crc kubenswrapper[4901]: I0309 03:32:41.543927 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgpxn" event={"ID":"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9","Type":"ContainerStarted","Data":"4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3"} Mar 09 03:32:41 crc kubenswrapper[4901]: I0309 03:32:41.587664 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kgpxn" podStartSLOduration=3.055275665 podStartE2EDuration="5.587636487s" podCreationTimestamp="2026-03-09 03:32:36 +0000 UTC" firstStartedPulling="2026-03-09 03:32:38.496689777 +0000 UTC m=+3083.086353539" lastFinishedPulling="2026-03-09 03:32:41.029050609 +0000 UTC m=+3085.618714361" observedRunningTime="2026-03-09 03:32:41.583678269 +0000 UTC m=+3086.173342031" watchObservedRunningTime="2026-03-09 03:32:41.587636487 +0000 UTC m=+3086.177300259" Mar 09 03:32:42 crc kubenswrapper[4901]: I0309 03:32:42.832342 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:42 crc kubenswrapper[4901]: I0309 03:32:42.909487 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:44 crc kubenswrapper[4901]: I0309 03:32:44.020068 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qx77w"] Mar 09 03:32:44 crc kubenswrapper[4901]: I0309 03:32:44.568578 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qx77w" podUID="352e87e5-5aa4-4800-bd7b-36391e72de86" 
containerName="registry-server" containerID="cri-o://e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6" gracePeriod=2 Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.562831 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.578538 4901 generic.go:334] "Generic (PLEG): container finished" podID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerID="e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6" exitCode=0 Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.578572 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx77w" event={"ID":"352e87e5-5aa4-4800-bd7b-36391e72de86","Type":"ContainerDied","Data":"e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6"} Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.578605 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qx77w" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.578624 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx77w" event={"ID":"352e87e5-5aa4-4800-bd7b-36391e72de86","Type":"ContainerDied","Data":"7be54e5799988cc2f2ca62481a1906dc51ec37b2f59b2bb4385ad2d6fc926e0e"} Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.578646 4901 scope.go:117] "RemoveContainer" containerID="e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.613328 4901 scope.go:117] "RemoveContainer" containerID="d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.638126 4901 scope.go:117] "RemoveContainer" containerID="6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.659027 4901 scope.go:117] "RemoveContainer" containerID="e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6" Mar 09 03:32:45 crc kubenswrapper[4901]: E0309 03:32:45.659811 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6\": container with ID starting with e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6 not found: ID does not exist" containerID="e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.659882 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6"} err="failed to get container status \"e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6\": rpc error: code = NotFound desc = could not find container 
\"e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6\": container with ID starting with e4a3ad67617261afbbe76b89113ba49f2b21f493c50aa75fe57b36ff849489f6 not found: ID does not exist" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.659925 4901 scope.go:117] "RemoveContainer" containerID="d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4" Mar 09 03:32:45 crc kubenswrapper[4901]: E0309 03:32:45.660530 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4\": container with ID starting with d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4 not found: ID does not exist" containerID="d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.660574 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4"} err="failed to get container status \"d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4\": rpc error: code = NotFound desc = could not find container \"d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4\": container with ID starting with d71c32ff7bad019c110b46d73ab3abe9c8ebb7bd4b1c63365c5268ea86cb0fc4 not found: ID does not exist" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.660602 4901 scope.go:117] "RemoveContainer" containerID="6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e" Mar 09 03:32:45 crc kubenswrapper[4901]: E0309 03:32:45.660977 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e\": container with ID starting with 6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e not found: ID does not exist" 
containerID="6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.661001 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e"} err="failed to get container status \"6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e\": rpc error: code = NotFound desc = could not find container \"6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e\": container with ID starting with 6e3f9c3f7d92bc26a927baa8359b38929ce944522a11f195a21f4afc4203d10e not found: ID does not exist" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.748931 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-utilities\") pod \"352e87e5-5aa4-4800-bd7b-36391e72de86\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.749126 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-catalog-content\") pod \"352e87e5-5aa4-4800-bd7b-36391e72de86\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.749367 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj9gn\" (UniqueName: \"kubernetes.io/projected/352e87e5-5aa4-4800-bd7b-36391e72de86-kube-api-access-vj9gn\") pod \"352e87e5-5aa4-4800-bd7b-36391e72de86\" (UID: \"352e87e5-5aa4-4800-bd7b-36391e72de86\") " Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.752810 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-utilities" (OuterVolumeSpecName: "utilities") pod 
"352e87e5-5aa4-4800-bd7b-36391e72de86" (UID: "352e87e5-5aa4-4800-bd7b-36391e72de86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.757844 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352e87e5-5aa4-4800-bd7b-36391e72de86-kube-api-access-vj9gn" (OuterVolumeSpecName: "kube-api-access-vj9gn") pod "352e87e5-5aa4-4800-bd7b-36391e72de86" (UID: "352e87e5-5aa4-4800-bd7b-36391e72de86"). InnerVolumeSpecName "kube-api-access-vj9gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.853119 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj9gn\" (UniqueName: \"kubernetes.io/projected/352e87e5-5aa4-4800-bd7b-36391e72de86-kube-api-access-vj9gn\") on node \"crc\" DevicePath \"\"" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.853163 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.954517 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "352e87e5-5aa4-4800-bd7b-36391e72de86" (UID: "352e87e5-5aa4-4800-bd7b-36391e72de86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:32:45 crc kubenswrapper[4901]: I0309 03:32:45.956149 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352e87e5-5aa4-4800-bd7b-36391e72de86-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:32:46 crc kubenswrapper[4901]: I0309 03:32:46.212334 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qx77w"] Mar 09 03:32:46 crc kubenswrapper[4901]: I0309 03:32:46.218576 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qx77w"] Mar 09 03:32:46 crc kubenswrapper[4901]: I0309 03:32:46.964581 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:46 crc kubenswrapper[4901]: I0309 03:32:46.964700 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:47 crc kubenswrapper[4901]: I0309 03:32:47.045106 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:47 crc kubenswrapper[4901]: I0309 03:32:47.667608 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:48 crc kubenswrapper[4901]: I0309 03:32:48.123571 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="352e87e5-5aa4-4800-bd7b-36391e72de86" path="/var/lib/kubelet/pods/352e87e5-5aa4-4800-bd7b-36391e72de86/volumes" Mar 09 03:32:49 crc kubenswrapper[4901]: I0309 03:32:49.405166 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kgpxn"] Mar 09 03:32:49 crc kubenswrapper[4901]: I0309 03:32:49.622094 4901 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-kgpxn" podUID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerName="registry-server" containerID="cri-o://4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3" gracePeriod=2 Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.583520 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.632331 4901 generic.go:334] "Generic (PLEG): container finished" podID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerID="4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3" exitCode=0 Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.632404 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgpxn" event={"ID":"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9","Type":"ContainerDied","Data":"4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3"} Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.632450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgpxn" event={"ID":"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9","Type":"ContainerDied","Data":"cf1cec8c4948a621bf4aa9981463a41c053456d59cd0de6edc22a1b3f472385f"} Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.632459 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgpxn" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.632482 4901 scope.go:117] "RemoveContainer" containerID="4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.660517 4901 scope.go:117] "RemoveContainer" containerID="e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.682528 4901 scope.go:117] "RemoveContainer" containerID="1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.700526 4901 scope.go:117] "RemoveContainer" containerID="4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3" Mar 09 03:32:50 crc kubenswrapper[4901]: E0309 03:32:50.701000 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3\": container with ID starting with 4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3 not found: ID does not exist" containerID="4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.701039 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3"} err="failed to get container status \"4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3\": rpc error: code = NotFound desc = could not find container \"4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3\": container with ID starting with 4472442f1739e325711bd3d903e8eb99a8c061fa1745efb0d274717cda849fa3 not found: ID does not exist" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.701077 4901 scope.go:117] "RemoveContainer" 
containerID="e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4" Mar 09 03:32:50 crc kubenswrapper[4901]: E0309 03:32:50.701616 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4\": container with ID starting with e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4 not found: ID does not exist" containerID="e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.701716 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4"} err="failed to get container status \"e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4\": rpc error: code = NotFound desc = could not find container \"e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4\": container with ID starting with e53bc1d1701b88938feb598932b09f11aaa941b671a21382ddbd451b7f173ff4 not found: ID does not exist" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.701765 4901 scope.go:117] "RemoveContainer" containerID="1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d" Mar 09 03:32:50 crc kubenswrapper[4901]: E0309 03:32:50.702368 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d\": container with ID starting with 1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d not found: ID does not exist" containerID="1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.702415 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d"} err="failed to get container status \"1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d\": rpc error: code = NotFound desc = could not find container \"1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d\": container with ID starting with 1cd982c307ddf27e80ec7ad0419709bba6ef26639d9d06f6016bafef7a7ccf7d not found: ID does not exist" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.745739 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-utilities\") pod \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.745956 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-catalog-content\") pod \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.746034 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt2xl\" (UniqueName: \"kubernetes.io/projected/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-kube-api-access-xt2xl\") pod \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\" (UID: \"91a84bf1-df1d-43fd-9b04-b95b9e43f9f9\") " Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.746887 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-utilities" (OuterVolumeSpecName: "utilities") pod "91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" (UID: "91a84bf1-df1d-43fd-9b04-b95b9e43f9f9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.753862 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-kube-api-access-xt2xl" (OuterVolumeSpecName: "kube-api-access-xt2xl") pod "91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" (UID: "91a84bf1-df1d-43fd-9b04-b95b9e43f9f9"). InnerVolumeSpecName "kube-api-access-xt2xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.815355 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" (UID: "91a84bf1-df1d-43fd-9b04-b95b9e43f9f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.848268 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt2xl\" (UniqueName: \"kubernetes.io/projected/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-kube-api-access-xt2xl\") on node \"crc\" DevicePath \"\"" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.848637 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.848657 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 03:32:50.980258 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kgpxn"] Mar 09 03:32:50 crc kubenswrapper[4901]: I0309 
03:32:50.984961 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kgpxn"] Mar 09 03:32:52 crc kubenswrapper[4901]: I0309 03:32:52.136742 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" path="/var/lib/kubelet/pods/91a84bf1-df1d-43fd-9b04-b95b9e43f9f9/volumes" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.166005 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550454-l6d8p"] Mar 09 03:34:00 crc kubenswrapper[4901]: E0309 03:34:00.167047 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerName="extract-content" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.167075 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerName="extract-content" Mar 09 03:34:00 crc kubenswrapper[4901]: E0309 03:34:00.167099 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerName="extract-utilities" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.167110 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerName="extract-utilities" Mar 09 03:34:00 crc kubenswrapper[4901]: E0309 03:34:00.167138 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerName="registry-server" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.167147 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerName="registry-server" Mar 09 03:34:00 crc kubenswrapper[4901]: E0309 03:34:00.167161 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerName="registry-server" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 
03:34:00.167170 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerName="registry-server" Mar 09 03:34:00 crc kubenswrapper[4901]: E0309 03:34:00.167189 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerName="extract-content" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.167198 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerName="extract-content" Mar 09 03:34:00 crc kubenswrapper[4901]: E0309 03:34:00.167244 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerName="extract-utilities" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.167256 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerName="extract-utilities" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.167512 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="352e87e5-5aa4-4800-bd7b-36391e72de86" containerName="registry-server" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.167556 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a84bf1-df1d-43fd-9b04-b95b9e43f9f9" containerName="registry-server" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.168286 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550454-l6d8p" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.171952 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.173565 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.174159 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.183642 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550454-l6d8p"] Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.310885 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2q7l\" (UniqueName: \"kubernetes.io/projected/644165d2-91ab-413a-b9e0-45ea35318943-kube-api-access-w2q7l\") pod \"auto-csr-approver-29550454-l6d8p\" (UID: \"644165d2-91ab-413a-b9e0-45ea35318943\") " pod="openshift-infra/auto-csr-approver-29550454-l6d8p" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.413139 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2q7l\" (UniqueName: \"kubernetes.io/projected/644165d2-91ab-413a-b9e0-45ea35318943-kube-api-access-w2q7l\") pod \"auto-csr-approver-29550454-l6d8p\" (UID: \"644165d2-91ab-413a-b9e0-45ea35318943\") " pod="openshift-infra/auto-csr-approver-29550454-l6d8p" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.446745 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2q7l\" (UniqueName: \"kubernetes.io/projected/644165d2-91ab-413a-b9e0-45ea35318943-kube-api-access-w2q7l\") pod \"auto-csr-approver-29550454-l6d8p\" (UID: \"644165d2-91ab-413a-b9e0-45ea35318943\") " 
pod="openshift-infra/auto-csr-approver-29550454-l6d8p" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.504292 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550454-l6d8p" Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.822862 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550454-l6d8p"] Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.863392 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:34:00 crc kubenswrapper[4901]: I0309 03:34:00.863480 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:34:01 crc kubenswrapper[4901]: I0309 03:34:01.379263 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550454-l6d8p" event={"ID":"644165d2-91ab-413a-b9e0-45ea35318943","Type":"ContainerStarted","Data":"437ea96f5c0fb6d5381b82431ee2683a04eff1ceaf2b0401f0b7e69d87d34659"} Mar 09 03:34:03 crc kubenswrapper[4901]: I0309 03:34:03.402194 4901 generic.go:334] "Generic (PLEG): container finished" podID="644165d2-91ab-413a-b9e0-45ea35318943" containerID="ab99b2238489620ae0271f3149ca6725e2f23b6049904ff499b7864bc2d5e6f0" exitCode=0 Mar 09 03:34:03 crc kubenswrapper[4901]: I0309 03:34:03.402282 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550454-l6d8p" 
event={"ID":"644165d2-91ab-413a-b9e0-45ea35318943","Type":"ContainerDied","Data":"ab99b2238489620ae0271f3149ca6725e2f23b6049904ff499b7864bc2d5e6f0"} Mar 09 03:34:04 crc kubenswrapper[4901]: I0309 03:34:04.803856 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550454-l6d8p" Mar 09 03:34:04 crc kubenswrapper[4901]: I0309 03:34:04.881748 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2q7l\" (UniqueName: \"kubernetes.io/projected/644165d2-91ab-413a-b9e0-45ea35318943-kube-api-access-w2q7l\") pod \"644165d2-91ab-413a-b9e0-45ea35318943\" (UID: \"644165d2-91ab-413a-b9e0-45ea35318943\") " Mar 09 03:34:04 crc kubenswrapper[4901]: I0309 03:34:04.892077 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644165d2-91ab-413a-b9e0-45ea35318943-kube-api-access-w2q7l" (OuterVolumeSpecName: "kube-api-access-w2q7l") pod "644165d2-91ab-413a-b9e0-45ea35318943" (UID: "644165d2-91ab-413a-b9e0-45ea35318943"). InnerVolumeSpecName "kube-api-access-w2q7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:34:04 crc kubenswrapper[4901]: I0309 03:34:04.983562 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2q7l\" (UniqueName: \"kubernetes.io/projected/644165d2-91ab-413a-b9e0-45ea35318943-kube-api-access-w2q7l\") on node \"crc\" DevicePath \"\"" Mar 09 03:34:05 crc kubenswrapper[4901]: I0309 03:34:05.421384 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550454-l6d8p" event={"ID":"644165d2-91ab-413a-b9e0-45ea35318943","Type":"ContainerDied","Data":"437ea96f5c0fb6d5381b82431ee2683a04eff1ceaf2b0401f0b7e69d87d34659"} Mar 09 03:34:05 crc kubenswrapper[4901]: I0309 03:34:05.421448 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="437ea96f5c0fb6d5381b82431ee2683a04eff1ceaf2b0401f0b7e69d87d34659" Mar 09 03:34:05 crc kubenswrapper[4901]: I0309 03:34:05.421467 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550454-l6d8p" Mar 09 03:34:05 crc kubenswrapper[4901]: I0309 03:34:05.888712 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550448-kz9dr"] Mar 09 03:34:05 crc kubenswrapper[4901]: I0309 03:34:05.895391 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550448-kz9dr"] Mar 09 03:34:06 crc kubenswrapper[4901]: I0309 03:34:06.124007 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d239abea-17e6-4c02-8897-a079ef1d0dc3" path="/var/lib/kubelet/pods/d239abea-17e6-4c02-8897-a079ef1d0dc3/volumes" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.594125 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wmwc6"] Mar 09 03:34:20 crc kubenswrapper[4901]: E0309 03:34:20.595150 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="644165d2-91ab-413a-b9e0-45ea35318943" containerName="oc" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.595170 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="644165d2-91ab-413a-b9e0-45ea35318943" containerName="oc" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.595453 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="644165d2-91ab-413a-b9e0-45ea35318943" containerName="oc" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.600402 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.626576 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmwc6"] Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.749243 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-utilities\") pod \"community-operators-wmwc6\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.749307 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-catalog-content\") pod \"community-operators-wmwc6\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.749336 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbcpt\" (UniqueName: \"kubernetes.io/projected/0eb1deb1-107d-4022-9922-1591cf4e406a-kube-api-access-qbcpt\") pod \"community-operators-wmwc6\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " 
pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.850439 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-utilities\") pod \"community-operators-wmwc6\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.850497 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-catalog-content\") pod \"community-operators-wmwc6\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.850523 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbcpt\" (UniqueName: \"kubernetes.io/projected/0eb1deb1-107d-4022-9922-1591cf4e406a-kube-api-access-qbcpt\") pod \"community-operators-wmwc6\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.851074 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-catalog-content\") pod \"community-operators-wmwc6\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.851497 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-utilities\") pod \"community-operators-wmwc6\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " 
pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.879811 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbcpt\" (UniqueName: \"kubernetes.io/projected/0eb1deb1-107d-4022-9922-1591cf4e406a-kube-api-access-qbcpt\") pod \"community-operators-wmwc6\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:20 crc kubenswrapper[4901]: I0309 03:34:20.928511 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:21 crc kubenswrapper[4901]: I0309 03:34:21.238289 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmwc6"] Mar 09 03:34:22 crc kubenswrapper[4901]: I0309 03:34:22.017683 4901 generic.go:334] "Generic (PLEG): container finished" podID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerID="17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4" exitCode=0 Mar 09 03:34:22 crc kubenswrapper[4901]: I0309 03:34:22.017771 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwc6" event={"ID":"0eb1deb1-107d-4022-9922-1591cf4e406a","Type":"ContainerDied","Data":"17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4"} Mar 09 03:34:22 crc kubenswrapper[4901]: I0309 03:34:22.018050 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwc6" event={"ID":"0eb1deb1-107d-4022-9922-1591cf4e406a","Type":"ContainerStarted","Data":"e7eb7e0c65d819196877cb41d52ca7f8b0ecf9b391a90a193b82a5aa98b59c1a"} Mar 09 03:34:23 crc kubenswrapper[4901]: I0309 03:34:23.031931 4901 generic.go:334] "Generic (PLEG): container finished" podID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerID="aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2" exitCode=0 Mar 09 03:34:23 crc 
kubenswrapper[4901]: I0309 03:34:23.032040 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwc6" event={"ID":"0eb1deb1-107d-4022-9922-1591cf4e406a","Type":"ContainerDied","Data":"aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2"} Mar 09 03:34:24 crc kubenswrapper[4901]: I0309 03:34:24.041807 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwc6" event={"ID":"0eb1deb1-107d-4022-9922-1591cf4e406a","Type":"ContainerStarted","Data":"980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84"} Mar 09 03:34:24 crc kubenswrapper[4901]: I0309 03:34:24.063157 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmwc6" podStartSLOduration=2.620812063 podStartE2EDuration="4.063138432s" podCreationTimestamp="2026-03-09 03:34:20 +0000 UTC" firstStartedPulling="2026-03-09 03:34:22.022089532 +0000 UTC m=+3186.611753304" lastFinishedPulling="2026-03-09 03:34:23.464415941 +0000 UTC m=+3188.054079673" observedRunningTime="2026-03-09 03:34:24.061109961 +0000 UTC m=+3188.650773693" watchObservedRunningTime="2026-03-09 03:34:24.063138432 +0000 UTC m=+3188.652802174" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.398165 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tq2dm"] Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.403640 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.424799 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tq2dm"] Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.506700 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2gvh\" (UniqueName: \"kubernetes.io/projected/a1475a47-1760-4892-98ee-3efb3bc6741a-kube-api-access-b2gvh\") pod \"redhat-marketplace-tq2dm\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.506828 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-catalog-content\") pod \"redhat-marketplace-tq2dm\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.506902 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-utilities\") pod \"redhat-marketplace-tq2dm\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.609372 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gvh\" (UniqueName: \"kubernetes.io/projected/a1475a47-1760-4892-98ee-3efb3bc6741a-kube-api-access-b2gvh\") pod \"redhat-marketplace-tq2dm\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.609780 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-catalog-content\") pod \"redhat-marketplace-tq2dm\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.610321 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-catalog-content\") pod \"redhat-marketplace-tq2dm\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.610815 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-utilities\") pod \"redhat-marketplace-tq2dm\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.611103 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-utilities\") pod \"redhat-marketplace-tq2dm\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.633290 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2gvh\" (UniqueName: \"kubernetes.io/projected/a1475a47-1760-4892-98ee-3efb3bc6741a-kube-api-access-b2gvh\") pod \"redhat-marketplace-tq2dm\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.735457 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.864642 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.864957 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.929060 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.929136 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:30 crc kubenswrapper[4901]: I0309 03:34:30.995920 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:31 crc kubenswrapper[4901]: I0309 03:34:31.006805 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tq2dm"] Mar 09 03:34:31 crc kubenswrapper[4901]: I0309 03:34:31.112395 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tq2dm" event={"ID":"a1475a47-1760-4892-98ee-3efb3bc6741a","Type":"ContainerStarted","Data":"2afdec5f25e409737319d298f03b1fb8a5747209ef1abad7591d770f6897f2a1"} Mar 09 03:34:31 crc kubenswrapper[4901]: I0309 03:34:31.158825 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:32 crc kubenswrapper[4901]: I0309 03:34:32.128449 4901 generic.go:334] "Generic (PLEG): container finished" podID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerID="b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26" exitCode=0 Mar 09 03:34:32 crc kubenswrapper[4901]: I0309 03:34:32.128961 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tq2dm" event={"ID":"a1475a47-1760-4892-98ee-3efb3bc6741a","Type":"ContainerDied","Data":"b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26"} Mar 09 03:34:33 crc kubenswrapper[4901]: I0309 03:34:33.140624 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tq2dm" event={"ID":"a1475a47-1760-4892-98ee-3efb3bc6741a","Type":"ContainerStarted","Data":"80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9"} Mar 09 03:34:33 crc kubenswrapper[4901]: I0309 03:34:33.367824 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmwc6"] Mar 09 03:34:33 crc kubenswrapper[4901]: I0309 03:34:33.368385 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wmwc6" podUID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerName="registry-server" containerID="cri-o://980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84" gracePeriod=2 Mar 09 03:34:33 crc kubenswrapper[4901]: I0309 03:34:33.834658 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:33 crc kubenswrapper[4901]: I0309 03:34:33.971337 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbcpt\" (UniqueName: \"kubernetes.io/projected/0eb1deb1-107d-4022-9922-1591cf4e406a-kube-api-access-qbcpt\") pod \"0eb1deb1-107d-4022-9922-1591cf4e406a\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " Mar 09 03:34:33 crc kubenswrapper[4901]: I0309 03:34:33.971424 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-catalog-content\") pod \"0eb1deb1-107d-4022-9922-1591cf4e406a\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " Mar 09 03:34:33 crc kubenswrapper[4901]: I0309 03:34:33.971475 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-utilities\") pod \"0eb1deb1-107d-4022-9922-1591cf4e406a\" (UID: \"0eb1deb1-107d-4022-9922-1591cf4e406a\") " Mar 09 03:34:33 crc kubenswrapper[4901]: I0309 03:34:33.973002 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-utilities" (OuterVolumeSpecName: "utilities") pod "0eb1deb1-107d-4022-9922-1591cf4e406a" (UID: "0eb1deb1-107d-4022-9922-1591cf4e406a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:34:33 crc kubenswrapper[4901]: I0309 03:34:33.980859 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb1deb1-107d-4022-9922-1591cf4e406a-kube-api-access-qbcpt" (OuterVolumeSpecName: "kube-api-access-qbcpt") pod "0eb1deb1-107d-4022-9922-1591cf4e406a" (UID: "0eb1deb1-107d-4022-9922-1591cf4e406a"). InnerVolumeSpecName "kube-api-access-qbcpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.056302 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0eb1deb1-107d-4022-9922-1591cf4e406a" (UID: "0eb1deb1-107d-4022-9922-1591cf4e406a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.073342 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbcpt\" (UniqueName: \"kubernetes.io/projected/0eb1deb1-107d-4022-9922-1591cf4e406a-kube-api-access-qbcpt\") on node \"crc\" DevicePath \"\"" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.073378 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.073393 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb1deb1-107d-4022-9922-1591cf4e406a-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.153429 4901 generic.go:334] "Generic (PLEG): container finished" podID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerID="80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9" exitCode=0 Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.153496 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tq2dm" event={"ID":"a1475a47-1760-4892-98ee-3efb3bc6741a","Type":"ContainerDied","Data":"80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9"} Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.160303 4901 generic.go:334] "Generic (PLEG): container 
finished" podID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerID="980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84" exitCode=0 Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.160368 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwc6" event={"ID":"0eb1deb1-107d-4022-9922-1591cf4e406a","Type":"ContainerDied","Data":"980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84"} Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.160380 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmwc6" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.160422 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwc6" event={"ID":"0eb1deb1-107d-4022-9922-1591cf4e406a","Type":"ContainerDied","Data":"e7eb7e0c65d819196877cb41d52ca7f8b0ecf9b391a90a193b82a5aa98b59c1a"} Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.160455 4901 scope.go:117] "RemoveContainer" containerID="980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.194979 4901 scope.go:117] "RemoveContainer" containerID="aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.210968 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmwc6"] Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.216952 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wmwc6"] Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.233144 4901 scope.go:117] "RemoveContainer" containerID="17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.266276 4901 scope.go:117] "RemoveContainer" 
containerID="980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84" Mar 09 03:34:34 crc kubenswrapper[4901]: E0309 03:34:34.266841 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84\": container with ID starting with 980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84 not found: ID does not exist" containerID="980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.266908 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84"} err="failed to get container status \"980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84\": rpc error: code = NotFound desc = could not find container \"980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84\": container with ID starting with 980b1f49403301680d1da03891a7f7ba1cfed7758366b28cedd85a7a1d3d4c84 not found: ID does not exist" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.266948 4901 scope.go:117] "RemoveContainer" containerID="aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2" Mar 09 03:34:34 crc kubenswrapper[4901]: E0309 03:34:34.267359 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2\": container with ID starting with aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2 not found: ID does not exist" containerID="aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.267423 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2"} err="failed to get container status \"aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2\": rpc error: code = NotFound desc = could not find container \"aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2\": container with ID starting with aa4ca4975ad2a82ce4e030ce9eee482bc776f6f1ecc050567d0b131ba695f7c2 not found: ID does not exist" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.267453 4901 scope.go:117] "RemoveContainer" containerID="17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4" Mar 09 03:34:34 crc kubenswrapper[4901]: E0309 03:34:34.268089 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4\": container with ID starting with 17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4 not found: ID does not exist" containerID="17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4" Mar 09 03:34:34 crc kubenswrapper[4901]: I0309 03:34:34.268144 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4"} err="failed to get container status \"17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4\": rpc error: code = NotFound desc = could not find container \"17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4\": container with ID starting with 17dbb8effe90c3734b496f634dc4b64f1a3fc769455e161a96d5d741c8ae6db4 not found: ID does not exist" Mar 09 03:34:35 crc kubenswrapper[4901]: I0309 03:34:35.175276 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tq2dm" 
event={"ID":"a1475a47-1760-4892-98ee-3efb3bc6741a","Type":"ContainerStarted","Data":"43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c"} Mar 09 03:34:36 crc kubenswrapper[4901]: I0309 03:34:36.117899 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb1deb1-107d-4022-9922-1591cf4e406a" path="/var/lib/kubelet/pods/0eb1deb1-107d-4022-9922-1591cf4e406a/volumes" Mar 09 03:34:39 crc kubenswrapper[4901]: I0309 03:34:39.470861 4901 scope.go:117] "RemoveContainer" containerID="aa5e32d1be8d61aeab028ce2be3ca80e3a83104e072267ce83f1464993cc68ba" Mar 09 03:34:40 crc kubenswrapper[4901]: I0309 03:34:40.737084 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:40 crc kubenswrapper[4901]: I0309 03:34:40.737444 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:40 crc kubenswrapper[4901]: I0309 03:34:40.812215 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:40 crc kubenswrapper[4901]: I0309 03:34:40.843758 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tq2dm" podStartSLOduration=8.316372744 podStartE2EDuration="10.843732503s" podCreationTimestamp="2026-03-09 03:34:30 +0000 UTC" firstStartedPulling="2026-03-09 03:34:32.132953565 +0000 UTC m=+3196.722617337" lastFinishedPulling="2026-03-09 03:34:34.660313324 +0000 UTC m=+3199.249977096" observedRunningTime="2026-03-09 03:34:35.210808812 +0000 UTC m=+3199.800472594" watchObservedRunningTime="2026-03-09 03:34:40.843732503 +0000 UTC m=+3205.433396265" Mar 09 03:34:41 crc kubenswrapper[4901]: I0309 03:34:41.307608 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:41 crc 
kubenswrapper[4901]: I0309 03:34:41.380256 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tq2dm"] Mar 09 03:34:43 crc kubenswrapper[4901]: I0309 03:34:43.254249 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tq2dm" podUID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerName="registry-server" containerID="cri-o://43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c" gracePeriod=2 Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.205944 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.275306 4901 generic.go:334] "Generic (PLEG): container finished" podID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerID="43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c" exitCode=0 Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.275375 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tq2dm" event={"ID":"a1475a47-1760-4892-98ee-3efb3bc6741a","Type":"ContainerDied","Data":"43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c"} Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.275406 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tq2dm" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.275450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tq2dm" event={"ID":"a1475a47-1760-4892-98ee-3efb3bc6741a","Type":"ContainerDied","Data":"2afdec5f25e409737319d298f03b1fb8a5747209ef1abad7591d770f6897f2a1"} Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.275478 4901 scope.go:117] "RemoveContainer" containerID="43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.300494 4901 scope.go:117] "RemoveContainer" containerID="80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.326373 4901 scope.go:117] "RemoveContainer" containerID="b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.345409 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-utilities\") pod \"a1475a47-1760-4892-98ee-3efb3bc6741a\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.345603 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2gvh\" (UniqueName: \"kubernetes.io/projected/a1475a47-1760-4892-98ee-3efb3bc6741a-kube-api-access-b2gvh\") pod \"a1475a47-1760-4892-98ee-3efb3bc6741a\" (UID: \"a1475a47-1760-4892-98ee-3efb3bc6741a\") " Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.345695 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-catalog-content\") pod \"a1475a47-1760-4892-98ee-3efb3bc6741a\" (UID: 
\"a1475a47-1760-4892-98ee-3efb3bc6741a\") " Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.346485 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-utilities" (OuterVolumeSpecName: "utilities") pod "a1475a47-1760-4892-98ee-3efb3bc6741a" (UID: "a1475a47-1760-4892-98ee-3efb3bc6741a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.351565 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1475a47-1760-4892-98ee-3efb3bc6741a-kube-api-access-b2gvh" (OuterVolumeSpecName: "kube-api-access-b2gvh") pod "a1475a47-1760-4892-98ee-3efb3bc6741a" (UID: "a1475a47-1760-4892-98ee-3efb3bc6741a"). InnerVolumeSpecName "kube-api-access-b2gvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.364517 4901 scope.go:117] "RemoveContainer" containerID="43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c" Mar 09 03:34:44 crc kubenswrapper[4901]: E0309 03:34:44.367304 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c\": container with ID starting with 43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c not found: ID does not exist" containerID="43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.367374 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c"} err="failed to get container status \"43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c\": rpc error: code = NotFound desc = could not find container 
\"43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c\": container with ID starting with 43b635d2359c58fcad6b554e642e8754b219ea20dace391f376820348381388c not found: ID does not exist" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.367409 4901 scope.go:117] "RemoveContainer" containerID="80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9" Mar 09 03:34:44 crc kubenswrapper[4901]: E0309 03:34:44.370722 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9\": container with ID starting with 80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9 not found: ID does not exist" containerID="80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.370760 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9"} err="failed to get container status \"80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9\": rpc error: code = NotFound desc = could not find container \"80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9\": container with ID starting with 80beb3796ed51bb6b926bad88019011df255d797d353c8f2d5e6d3c1fb1e01f9 not found: ID does not exist" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.370787 4901 scope.go:117] "RemoveContainer" containerID="b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26" Mar 09 03:34:44 crc kubenswrapper[4901]: E0309 03:34:44.371274 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26\": container with ID starting with b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26 not found: ID does not exist" 
containerID="b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.371309 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26"} err="failed to get container status \"b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26\": rpc error: code = NotFound desc = could not find container \"b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26\": container with ID starting with b199107bb56d813def31ce9e3887286576fc438ab878e4b0db5936cf194b2f26 not found: ID does not exist" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.383407 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1475a47-1760-4892-98ee-3efb3bc6741a" (UID: "a1475a47-1760-4892-98ee-3efb3bc6741a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.447883 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.447922 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1475a47-1760-4892-98ee-3efb3bc6741a-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.447935 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2gvh\" (UniqueName: \"kubernetes.io/projected/a1475a47-1760-4892-98ee-3efb3bc6741a-kube-api-access-b2gvh\") on node \"crc\" DevicePath \"\"" Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.637353 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tq2dm"] Mar 09 03:34:44 crc kubenswrapper[4901]: I0309 03:34:44.666410 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tq2dm"] Mar 09 03:34:46 crc kubenswrapper[4901]: I0309 03:34:46.120180 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1475a47-1760-4892-98ee-3efb3bc6741a" path="/var/lib/kubelet/pods/a1475a47-1760-4892-98ee-3efb3bc6741a/volumes" Mar 09 03:35:00 crc kubenswrapper[4901]: I0309 03:35:00.863883 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:35:00 crc kubenswrapper[4901]: I0309 03:35:00.864811 4901 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:35:00 crc kubenswrapper[4901]: I0309 03:35:00.864902 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 03:35:00 crc kubenswrapper[4901]: I0309 03:35:00.865991 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 03:35:00 crc kubenswrapper[4901]: I0309 03:35:00.866109 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be" gracePeriod=600 Mar 09 03:35:00 crc kubenswrapper[4901]: E0309 03:35:00.988358 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:35:01 crc kubenswrapper[4901]: I0309 03:35:01.427479 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" 
containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be" exitCode=0 Mar 09 03:35:01 crc kubenswrapper[4901]: I0309 03:35:01.427560 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"} Mar 09 03:35:01 crc kubenswrapper[4901]: I0309 03:35:01.427628 4901 scope.go:117] "RemoveContainer" containerID="96795b44fdfb1d02477775a37c4cf63b6931ee38b13fb7a897de264e1203c8a7" Mar 09 03:35:01 crc kubenswrapper[4901]: I0309 03:35:01.428554 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be" Mar 09 03:35:01 crc kubenswrapper[4901]: E0309 03:35:01.429023 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:35:16 crc kubenswrapper[4901]: I0309 03:35:16.210321 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be" Mar 09 03:35:16 crc kubenswrapper[4901]: E0309 03:35:16.211091 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:35:31 crc kubenswrapper[4901]: I0309 
03:35:31.106799 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be" Mar 09 03:35:31 crc kubenswrapper[4901]: E0309 03:35:31.107896 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:35:42 crc kubenswrapper[4901]: I0309 03:35:42.107171 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be" Mar 09 03:35:42 crc kubenswrapper[4901]: E0309 03:35:42.109113 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:35:57 crc kubenswrapper[4901]: I0309 03:35:57.107330 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be" Mar 09 03:35:57 crc kubenswrapper[4901]: E0309 03:35:57.108433 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:36:00 crc 
kubenswrapper[4901]: I0309 03:36:00.226376 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550456-jrgmq"] Mar 09 03:36:00 crc kubenswrapper[4901]: E0309 03:36:00.227217 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerName="extract-content" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.227263 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerName="extract-content" Mar 09 03:36:00 crc kubenswrapper[4901]: E0309 03:36:00.227295 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerName="registry-server" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.227309 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerName="registry-server" Mar 09 03:36:00 crc kubenswrapper[4901]: E0309 03:36:00.227326 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerName="extract-utilities" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.227342 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerName="extract-utilities" Mar 09 03:36:00 crc kubenswrapper[4901]: E0309 03:36:00.227367 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerName="extract-utilities" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.227380 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerName="extract-utilities" Mar 09 03:36:00 crc kubenswrapper[4901]: E0309 03:36:00.227413 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerName="registry-server" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 
03:36:00.227426 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerName="registry-server" Mar 09 03:36:00 crc kubenswrapper[4901]: E0309 03:36:00.227449 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerName="extract-content" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.227461 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerName="extract-content" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.227713 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1475a47-1760-4892-98ee-3efb3bc6741a" containerName="registry-server" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.227745 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb1deb1-107d-4022-9922-1591cf4e406a" containerName="registry-server" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.228500 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550456-jrgmq" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.231911 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.231937 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.231921 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.247310 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550456-jrgmq"] Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.385132 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsfjw\" (UniqueName: \"kubernetes.io/projected/c947ff8e-466c-4494-b50f-3ee72935d020-kube-api-access-bsfjw\") pod \"auto-csr-approver-29550456-jrgmq\" (UID: \"c947ff8e-466c-4494-b50f-3ee72935d020\") " pod="openshift-infra/auto-csr-approver-29550456-jrgmq" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.486667 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsfjw\" (UniqueName: \"kubernetes.io/projected/c947ff8e-466c-4494-b50f-3ee72935d020-kube-api-access-bsfjw\") pod \"auto-csr-approver-29550456-jrgmq\" (UID: \"c947ff8e-466c-4494-b50f-3ee72935d020\") " pod="openshift-infra/auto-csr-approver-29550456-jrgmq" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.518134 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsfjw\" (UniqueName: \"kubernetes.io/projected/c947ff8e-466c-4494-b50f-3ee72935d020-kube-api-access-bsfjw\") pod \"auto-csr-approver-29550456-jrgmq\" (UID: \"c947ff8e-466c-4494-b50f-3ee72935d020\") " 
pod="openshift-infra/auto-csr-approver-29550456-jrgmq" Mar 09 03:36:00 crc kubenswrapper[4901]: I0309 03:36:00.562915 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550456-jrgmq" Mar 09 03:36:01 crc kubenswrapper[4901]: I0309 03:36:00.863445 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550456-jrgmq"] Mar 09 03:36:01 crc kubenswrapper[4901]: I0309 03:36:00.869549 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 03:36:01 crc kubenswrapper[4901]: I0309 03:36:00.968599 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550456-jrgmq" event={"ID":"c947ff8e-466c-4494-b50f-3ee72935d020","Type":"ContainerStarted","Data":"f4d0b6a3b288350e743217e467bb0f4076fa9e9d5927eddd9ec95321688d353d"} Mar 09 03:36:02 crc kubenswrapper[4901]: I0309 03:36:02.992507 4901 generic.go:334] "Generic (PLEG): container finished" podID="c947ff8e-466c-4494-b50f-3ee72935d020" containerID="f520af7cb0e5e0c448b95b3982d9002ebfd3b9a21bae6adc977dcee76e8ee446" exitCode=0 Mar 09 03:36:02 crc kubenswrapper[4901]: I0309 03:36:02.992588 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550456-jrgmq" event={"ID":"c947ff8e-466c-4494-b50f-3ee72935d020","Type":"ContainerDied","Data":"f520af7cb0e5e0c448b95b3982d9002ebfd3b9a21bae6adc977dcee76e8ee446"} Mar 09 03:36:04 crc kubenswrapper[4901]: I0309 03:36:04.349040 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550456-jrgmq" Mar 09 03:36:04 crc kubenswrapper[4901]: I0309 03:36:04.451357 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsfjw\" (UniqueName: \"kubernetes.io/projected/c947ff8e-466c-4494-b50f-3ee72935d020-kube-api-access-bsfjw\") pod \"c947ff8e-466c-4494-b50f-3ee72935d020\" (UID: \"c947ff8e-466c-4494-b50f-3ee72935d020\") " Mar 09 03:36:04 crc kubenswrapper[4901]: I0309 03:36:04.459108 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c947ff8e-466c-4494-b50f-3ee72935d020-kube-api-access-bsfjw" (OuterVolumeSpecName: "kube-api-access-bsfjw") pod "c947ff8e-466c-4494-b50f-3ee72935d020" (UID: "c947ff8e-466c-4494-b50f-3ee72935d020"). InnerVolumeSpecName "kube-api-access-bsfjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:36:04 crc kubenswrapper[4901]: I0309 03:36:04.553508 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsfjw\" (UniqueName: \"kubernetes.io/projected/c947ff8e-466c-4494-b50f-3ee72935d020-kube-api-access-bsfjw\") on node \"crc\" DevicePath \"\"" Mar 09 03:36:05 crc kubenswrapper[4901]: I0309 03:36:05.017470 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550456-jrgmq" event={"ID":"c947ff8e-466c-4494-b50f-3ee72935d020","Type":"ContainerDied","Data":"f4d0b6a3b288350e743217e467bb0f4076fa9e9d5927eddd9ec95321688d353d"} Mar 09 03:36:05 crc kubenswrapper[4901]: I0309 03:36:05.017528 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4d0b6a3b288350e743217e467bb0f4076fa9e9d5927eddd9ec95321688d353d" Mar 09 03:36:05 crc kubenswrapper[4901]: I0309 03:36:05.017548 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550456-jrgmq" Mar 09 03:36:05 crc kubenswrapper[4901]: I0309 03:36:05.448377 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550450-tj9sv"] Mar 09 03:36:05 crc kubenswrapper[4901]: I0309 03:36:05.461338 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550450-tj9sv"] Mar 09 03:36:06 crc kubenswrapper[4901]: I0309 03:36:06.122031 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fba0d69-8ec5-4a56-8486-db533ab566ec" path="/var/lib/kubelet/pods/3fba0d69-8ec5-4a56-8486-db533ab566ec/volumes" Mar 09 03:36:10 crc kubenswrapper[4901]: I0309 03:36:10.107483 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be" Mar 09 03:36:10 crc kubenswrapper[4901]: E0309 03:36:10.109432 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:36:24 crc kubenswrapper[4901]: I0309 03:36:24.107270 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be" Mar 09 03:36:24 crc kubenswrapper[4901]: E0309 03:36:24.109031 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:36:35 crc kubenswrapper[4901]: I0309 03:36:35.106120 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:36:35 crc kubenswrapper[4901]: E0309 03:36:35.107131 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:36:39 crc kubenswrapper[4901]: I0309 03:36:39.632559 4901 scope.go:117] "RemoveContainer" containerID="860adf32ce83cf817b6966b87e414ece86915ccb41cbd993e7eb57e154dad9c6"
Mar 09 03:36:47 crc kubenswrapper[4901]: I0309 03:36:47.107026 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:36:47 crc kubenswrapper[4901]: E0309 03:36:47.108045 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:37:01 crc kubenswrapper[4901]: I0309 03:37:01.106638 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:37:01 crc kubenswrapper[4901]: E0309 03:37:01.107773 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:37:14 crc kubenswrapper[4901]: I0309 03:37:14.106442 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:37:14 crc kubenswrapper[4901]: E0309 03:37:14.107445 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:37:26 crc kubenswrapper[4901]: I0309 03:37:26.113883 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:37:26 crc kubenswrapper[4901]: E0309 03:37:26.115637 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:37:39 crc kubenswrapper[4901]: I0309 03:37:39.105994 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:37:39 crc kubenswrapper[4901]: E0309 03:37:39.107008 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:37:51 crc kubenswrapper[4901]: I0309 03:37:51.106024 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:37:51 crc kubenswrapper[4901]: E0309 03:37:51.106958 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.158529 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550458-fzptb"]
Mar 09 03:38:00 crc kubenswrapper[4901]: E0309 03:38:00.159646 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c947ff8e-466c-4494-b50f-3ee72935d020" containerName="oc"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.159664 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c947ff8e-466c-4494-b50f-3ee72935d020" containerName="oc"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.159898 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c947ff8e-466c-4494-b50f-3ee72935d020" containerName="oc"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.160775 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550458-fzptb"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.163723 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.169085 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.169093 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.183943 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550458-fzptb"]
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.256278 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26m7j\" (UniqueName: \"kubernetes.io/projected/fc4c9e9f-d439-4994-8ac0-58713392ec57-kube-api-access-26m7j\") pod \"auto-csr-approver-29550458-fzptb\" (UID: \"fc4c9e9f-d439-4994-8ac0-58713392ec57\") " pod="openshift-infra/auto-csr-approver-29550458-fzptb"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.358394 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26m7j\" (UniqueName: \"kubernetes.io/projected/fc4c9e9f-d439-4994-8ac0-58713392ec57-kube-api-access-26m7j\") pod \"auto-csr-approver-29550458-fzptb\" (UID: \"fc4c9e9f-d439-4994-8ac0-58713392ec57\") " pod="openshift-infra/auto-csr-approver-29550458-fzptb"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.381862 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26m7j\" (UniqueName: \"kubernetes.io/projected/fc4c9e9f-d439-4994-8ac0-58713392ec57-kube-api-access-26m7j\") pod \"auto-csr-approver-29550458-fzptb\" (UID: \"fc4c9e9f-d439-4994-8ac0-58713392ec57\") " pod="openshift-infra/auto-csr-approver-29550458-fzptb"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.493530 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550458-fzptb"
Mar 09 03:38:00 crc kubenswrapper[4901]: I0309 03:38:00.822452 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550458-fzptb"]
Mar 09 03:38:01 crc kubenswrapper[4901]: I0309 03:38:01.098646 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550458-fzptb" event={"ID":"fc4c9e9f-d439-4994-8ac0-58713392ec57","Type":"ContainerStarted","Data":"5ece90d2d62413007df384b154fc438c97f19db1acee17d5ced4faba63bdd1df"}
Mar 09 03:38:03 crc kubenswrapper[4901]: I0309 03:38:03.107049 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:38:03 crc kubenswrapper[4901]: E0309 03:38:03.107810 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:38:03 crc kubenswrapper[4901]: I0309 03:38:03.139131 4901 generic.go:334] "Generic (PLEG): container finished" podID="fc4c9e9f-d439-4994-8ac0-58713392ec57" containerID="b3c104c29b028fff48a2227ef633803b64f801389653e5449fbf7e2b756b3c22" exitCode=0
Mar 09 03:38:03 crc kubenswrapper[4901]: I0309 03:38:03.139199 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550458-fzptb" event={"ID":"fc4c9e9f-d439-4994-8ac0-58713392ec57","Type":"ContainerDied","Data":"b3c104c29b028fff48a2227ef633803b64f801389653e5449fbf7e2b756b3c22"}
Mar 09 03:38:04 crc kubenswrapper[4901]: I0309 03:38:04.568156 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550458-fzptb"
Mar 09 03:38:04 crc kubenswrapper[4901]: I0309 03:38:04.725322 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26m7j\" (UniqueName: \"kubernetes.io/projected/fc4c9e9f-d439-4994-8ac0-58713392ec57-kube-api-access-26m7j\") pod \"fc4c9e9f-d439-4994-8ac0-58713392ec57\" (UID: \"fc4c9e9f-d439-4994-8ac0-58713392ec57\") "
Mar 09 03:38:04 crc kubenswrapper[4901]: I0309 03:38:04.733547 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc4c9e9f-d439-4994-8ac0-58713392ec57-kube-api-access-26m7j" (OuterVolumeSpecName: "kube-api-access-26m7j") pod "fc4c9e9f-d439-4994-8ac0-58713392ec57" (UID: "fc4c9e9f-d439-4994-8ac0-58713392ec57"). InnerVolumeSpecName "kube-api-access-26m7j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:38:04 crc kubenswrapper[4901]: I0309 03:38:04.828328 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26m7j\" (UniqueName: \"kubernetes.io/projected/fc4c9e9f-d439-4994-8ac0-58713392ec57-kube-api-access-26m7j\") on node \"crc\" DevicePath \"\""
Mar 09 03:38:05 crc kubenswrapper[4901]: I0309 03:38:05.163145 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550458-fzptb" event={"ID":"fc4c9e9f-d439-4994-8ac0-58713392ec57","Type":"ContainerDied","Data":"5ece90d2d62413007df384b154fc438c97f19db1acee17d5ced4faba63bdd1df"}
Mar 09 03:38:05 crc kubenswrapper[4901]: I0309 03:38:05.163199 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ece90d2d62413007df384b154fc438c97f19db1acee17d5ced4faba63bdd1df"
Mar 09 03:38:05 crc kubenswrapper[4901]: I0309 03:38:05.163262 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550458-fzptb"
Mar 09 03:38:05 crc kubenswrapper[4901]: I0309 03:38:05.681215 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550452-4sxfr"]
Mar 09 03:38:05 crc kubenswrapper[4901]: I0309 03:38:05.691540 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550452-4sxfr"]
Mar 09 03:38:06 crc kubenswrapper[4901]: I0309 03:38:06.122583 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa50188-8fc6-4944-8be3-9d8056584741" path="/var/lib/kubelet/pods/9aa50188-8fc6-4944-8be3-9d8056584741/volumes"
Mar 09 03:38:18 crc kubenswrapper[4901]: I0309 03:38:18.108122 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:38:18 crc kubenswrapper[4901]: E0309 03:38:18.109087 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:38:30 crc kubenswrapper[4901]: I0309 03:38:30.106438 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:38:30 crc kubenswrapper[4901]: E0309 03:38:30.107419 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:38:39 crc kubenswrapper[4901]: I0309 03:38:39.720425 4901 scope.go:117] "RemoveContainer" containerID="1d27618494908526afd1286a15d708f6ad8a48e3f32445bca208dbf02c3d938b"
Mar 09 03:38:43 crc kubenswrapper[4901]: I0309 03:38:43.106909 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:38:43 crc kubenswrapper[4901]: E0309 03:38:43.107926 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:38:57 crc kubenswrapper[4901]: I0309 03:38:57.106866 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:38:57 crc kubenswrapper[4901]: E0309 03:38:57.107553 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:39:11 crc kubenswrapper[4901]: I0309 03:39:11.106457 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:39:11 crc kubenswrapper[4901]: E0309 03:39:11.107623 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:39:26 crc kubenswrapper[4901]: I0309 03:39:26.115736 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:39:26 crc kubenswrapper[4901]: E0309 03:39:26.117947 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:39:39 crc kubenswrapper[4901]: I0309 03:39:39.107424 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:39:39 crc kubenswrapper[4901]: E0309 03:39:39.108564 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:39:54 crc kubenswrapper[4901]: I0309 03:39:54.106696 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:39:54 crc kubenswrapper[4901]: E0309 03:39:54.107654 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.175041 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550460-d7sfz"]
Mar 09 03:40:00 crc kubenswrapper[4901]: E0309 03:40:00.175853 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4c9e9f-d439-4994-8ac0-58713392ec57" containerName="oc"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.175876 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4c9e9f-d439-4994-8ac0-58713392ec57" containerName="oc"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.176210 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4c9e9f-d439-4994-8ac0-58713392ec57" containerName="oc"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.177006 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550460-d7sfz"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.185478 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.186126 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.186207 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.186586 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550460-d7sfz"]
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.308754 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gnsz\" (UniqueName: \"kubernetes.io/projected/f6f29225-f0d0-47b2-8817-0cf513c7f9fb-kube-api-access-7gnsz\") pod \"auto-csr-approver-29550460-d7sfz\" (UID: \"f6f29225-f0d0-47b2-8817-0cf513c7f9fb\") " pod="openshift-infra/auto-csr-approver-29550460-d7sfz"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.410826 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gnsz\" (UniqueName: \"kubernetes.io/projected/f6f29225-f0d0-47b2-8817-0cf513c7f9fb-kube-api-access-7gnsz\") pod \"auto-csr-approver-29550460-d7sfz\" (UID: \"f6f29225-f0d0-47b2-8817-0cf513c7f9fb\") " pod="openshift-infra/auto-csr-approver-29550460-d7sfz"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.436446 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gnsz\" (UniqueName: \"kubernetes.io/projected/f6f29225-f0d0-47b2-8817-0cf513c7f9fb-kube-api-access-7gnsz\") pod \"auto-csr-approver-29550460-d7sfz\" (UID: \"f6f29225-f0d0-47b2-8817-0cf513c7f9fb\") " pod="openshift-infra/auto-csr-approver-29550460-d7sfz"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.498039 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550460-d7sfz"
Mar 09 03:40:00 crc kubenswrapper[4901]: I0309 03:40:00.997481 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550460-d7sfz"]
Mar 09 03:40:01 crc kubenswrapper[4901]: I0309 03:40:01.231956 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550460-d7sfz" event={"ID":"f6f29225-f0d0-47b2-8817-0cf513c7f9fb","Type":"ContainerStarted","Data":"b853a96d15fbcc7322145fc9738ef2fdad914aff3ddd8ca3fcf55ddbddc59853"}
Mar 09 03:40:03 crc kubenswrapper[4901]: I0309 03:40:03.257328 4901 generic.go:334] "Generic (PLEG): container finished" podID="f6f29225-f0d0-47b2-8817-0cf513c7f9fb" containerID="31645a9ab252414bee756c22fe7cffb652c70f26b63dee87cf21db0ea5f7250b" exitCode=0
Mar 09 03:40:03 crc kubenswrapper[4901]: I0309 03:40:03.257418 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550460-d7sfz" event={"ID":"f6f29225-f0d0-47b2-8817-0cf513c7f9fb","Type":"ContainerDied","Data":"31645a9ab252414bee756c22fe7cffb652c70f26b63dee87cf21db0ea5f7250b"}
Mar 09 03:40:04 crc kubenswrapper[4901]: I0309 03:40:04.620687 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550460-d7sfz"
Mar 09 03:40:04 crc kubenswrapper[4901]: I0309 03:40:04.793287 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gnsz\" (UniqueName: \"kubernetes.io/projected/f6f29225-f0d0-47b2-8817-0cf513c7f9fb-kube-api-access-7gnsz\") pod \"f6f29225-f0d0-47b2-8817-0cf513c7f9fb\" (UID: \"f6f29225-f0d0-47b2-8817-0cf513c7f9fb\") "
Mar 09 03:40:04 crc kubenswrapper[4901]: I0309 03:40:04.801786 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f29225-f0d0-47b2-8817-0cf513c7f9fb-kube-api-access-7gnsz" (OuterVolumeSpecName: "kube-api-access-7gnsz") pod "f6f29225-f0d0-47b2-8817-0cf513c7f9fb" (UID: "f6f29225-f0d0-47b2-8817-0cf513c7f9fb"). InnerVolumeSpecName "kube-api-access-7gnsz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:40:04 crc kubenswrapper[4901]: I0309 03:40:04.895126 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gnsz\" (UniqueName: \"kubernetes.io/projected/f6f29225-f0d0-47b2-8817-0cf513c7f9fb-kube-api-access-7gnsz\") on node \"crc\" DevicePath \"\""
Mar 09 03:40:05 crc kubenswrapper[4901]: I0309 03:40:05.283253 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550460-d7sfz" event={"ID":"f6f29225-f0d0-47b2-8817-0cf513c7f9fb","Type":"ContainerDied","Data":"b853a96d15fbcc7322145fc9738ef2fdad914aff3ddd8ca3fcf55ddbddc59853"}
Mar 09 03:40:05 crc kubenswrapper[4901]: I0309 03:40:05.283322 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b853a96d15fbcc7322145fc9738ef2fdad914aff3ddd8ca3fcf55ddbddc59853"
Mar 09 03:40:05 crc kubenswrapper[4901]: I0309 03:40:05.283408 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550460-d7sfz"
Mar 09 03:40:05 crc kubenswrapper[4901]: I0309 03:40:05.714899 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550454-l6d8p"]
Mar 09 03:40:05 crc kubenswrapper[4901]: I0309 03:40:05.726244 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550454-l6d8p"]
Mar 09 03:40:06 crc kubenswrapper[4901]: I0309 03:40:06.119705 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="644165d2-91ab-413a-b9e0-45ea35318943" path="/var/lib/kubelet/pods/644165d2-91ab-413a-b9e0-45ea35318943/volumes"
Mar 09 03:40:07 crc kubenswrapper[4901]: I0309 03:40:07.106520 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:40:07 crc kubenswrapper[4901]: I0309 03:40:07.299533 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"f516d08805ad123344b125ee30303c001e014432b07ec580df27cb03e8215ae7"}
Mar 09 03:40:39 crc kubenswrapper[4901]: I0309 03:40:39.833552 4901 scope.go:117] "RemoveContainer" containerID="ab99b2238489620ae0271f3149ca6725e2f23b6049904ff499b7864bc2d5e6f0"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.175535 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550462-mk9cz"]
Mar 09 03:42:00 crc kubenswrapper[4901]: E0309 03:42:00.176624 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f29225-f0d0-47b2-8817-0cf513c7f9fb" containerName="oc"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.176643 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f29225-f0d0-47b2-8817-0cf513c7f9fb" containerName="oc"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.176875 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f29225-f0d0-47b2-8817-0cf513c7f9fb" containerName="oc"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.177559 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550462-mk9cz"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.180285 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.180864 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.181066 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.196328 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550462-mk9cz"]
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.246194 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbbgl\" (UniqueName: \"kubernetes.io/projected/35f6249c-909a-43ab-a1b4-b33e9473c5e8-kube-api-access-bbbgl\") pod \"auto-csr-approver-29550462-mk9cz\" (UID: \"35f6249c-909a-43ab-a1b4-b33e9473c5e8\") " pod="openshift-infra/auto-csr-approver-29550462-mk9cz"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.348089 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbbgl\" (UniqueName: \"kubernetes.io/projected/35f6249c-909a-43ab-a1b4-b33e9473c5e8-kube-api-access-bbbgl\") pod \"auto-csr-approver-29550462-mk9cz\" (UID: \"35f6249c-909a-43ab-a1b4-b33e9473c5e8\") " pod="openshift-infra/auto-csr-approver-29550462-mk9cz"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.390448 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbbgl\" (UniqueName: \"kubernetes.io/projected/35f6249c-909a-43ab-a1b4-b33e9473c5e8-kube-api-access-bbbgl\") pod \"auto-csr-approver-29550462-mk9cz\" (UID: \"35f6249c-909a-43ab-a1b4-b33e9473c5e8\") " pod="openshift-infra/auto-csr-approver-29550462-mk9cz"
Mar 09 03:42:00 crc kubenswrapper[4901]: I0309 03:42:00.519350 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550462-mk9cz"
Mar 09 03:42:01 crc kubenswrapper[4901]: I0309 03:42:01.053271 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550462-mk9cz"]
Mar 09 03:42:01 crc kubenswrapper[4901]: I0309 03:42:01.064952 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 03:42:01 crc kubenswrapper[4901]: I0309 03:42:01.410050 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550462-mk9cz" event={"ID":"35f6249c-909a-43ab-a1b4-b33e9473c5e8","Type":"ContainerStarted","Data":"865b05040fdb1539cbc1b2dfc0d5880a8a9c1b3384e9e482e117fd9caf16b018"}
Mar 09 03:42:03 crc kubenswrapper[4901]: I0309 03:42:03.430735 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550462-mk9cz" event={"ID":"35f6249c-909a-43ab-a1b4-b33e9473c5e8","Type":"ContainerStarted","Data":"4cd3269bf71ec5cc1ac624c14e83b983a53eef7e3afe963693716fbe54b0fd54"}
Mar 09 03:42:03 crc kubenswrapper[4901]: I0309 03:42:03.473128 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550462-mk9cz" podStartSLOduration=1.6145775439999999 podStartE2EDuration="3.473109611s" podCreationTimestamp="2026-03-09 03:42:00 +0000 UTC" firstStartedPulling="2026-03-09 03:42:01.064529889 +0000 UTC m=+3645.654193651" lastFinishedPulling="2026-03-09 03:42:02.923061956 +0000 UTC m=+3647.512725718" observedRunningTime="2026-03-09 03:42:03.47156892 +0000 UTC m=+3648.061232652" watchObservedRunningTime="2026-03-09 03:42:03.473109611 +0000 UTC m=+3648.062773343"
Mar 09 03:42:04 crc kubenswrapper[4901]: I0309 03:42:04.442887 4901 generic.go:334] "Generic (PLEG): container finished" podID="35f6249c-909a-43ab-a1b4-b33e9473c5e8" containerID="4cd3269bf71ec5cc1ac624c14e83b983a53eef7e3afe963693716fbe54b0fd54" exitCode=0
Mar 09 03:42:04 crc kubenswrapper[4901]: I0309 03:42:04.442949 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550462-mk9cz" event={"ID":"35f6249c-909a-43ab-a1b4-b33e9473c5e8","Type":"ContainerDied","Data":"4cd3269bf71ec5cc1ac624c14e83b983a53eef7e3afe963693716fbe54b0fd54"}
Mar 09 03:42:05 crc kubenswrapper[4901]: I0309 03:42:05.798074 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550462-mk9cz"
Mar 09 03:42:05 crc kubenswrapper[4901]: I0309 03:42:05.843947 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbbgl\" (UniqueName: \"kubernetes.io/projected/35f6249c-909a-43ab-a1b4-b33e9473c5e8-kube-api-access-bbbgl\") pod \"35f6249c-909a-43ab-a1b4-b33e9473c5e8\" (UID: \"35f6249c-909a-43ab-a1b4-b33e9473c5e8\") "
Mar 09 03:42:05 crc kubenswrapper[4901]: I0309 03:42:05.852696 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f6249c-909a-43ab-a1b4-b33e9473c5e8-kube-api-access-bbbgl" (OuterVolumeSpecName: "kube-api-access-bbbgl") pod "35f6249c-909a-43ab-a1b4-b33e9473c5e8" (UID: "35f6249c-909a-43ab-a1b4-b33e9473c5e8"). InnerVolumeSpecName "kube-api-access-bbbgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:42:05 crc kubenswrapper[4901]: I0309 03:42:05.946350 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbbgl\" (UniqueName: \"kubernetes.io/projected/35f6249c-909a-43ab-a1b4-b33e9473c5e8-kube-api-access-bbbgl\") on node \"crc\" DevicePath \"\""
Mar 09 03:42:06 crc kubenswrapper[4901]: I0309 03:42:06.466959 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550462-mk9cz" event={"ID":"35f6249c-909a-43ab-a1b4-b33e9473c5e8","Type":"ContainerDied","Data":"865b05040fdb1539cbc1b2dfc0d5880a8a9c1b3384e9e482e117fd9caf16b018"}
Mar 09 03:42:06 crc kubenswrapper[4901]: I0309 03:42:06.467032 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="865b05040fdb1539cbc1b2dfc0d5880a8a9c1b3384e9e482e117fd9caf16b018"
Mar 09 03:42:06 crc kubenswrapper[4901]: I0309 03:42:06.467037 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550462-mk9cz"
Mar 09 03:42:06 crc kubenswrapper[4901]: I0309 03:42:06.901459 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550456-jrgmq"]
Mar 09 03:42:06 crc kubenswrapper[4901]: I0309 03:42:06.910002 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550456-jrgmq"]
Mar 09 03:42:08 crc kubenswrapper[4901]: I0309 03:42:08.123161 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c947ff8e-466c-4494-b50f-3ee72935d020" path="/var/lib/kubelet/pods/c947ff8e-466c-4494-b50f-3ee72935d020/volumes"
Mar 09 03:42:30 crc kubenswrapper[4901]: I0309 03:42:30.862878 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 03:42:30 crc kubenswrapper[4901]: I0309 03:42:30.863615 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 03:42:39 crc kubenswrapper[4901]: I0309 03:42:39.948415 4901 scope.go:117] "RemoveContainer" containerID="f520af7cb0e5e0c448b95b3982d9002ebfd3b9a21bae6adc977dcee76e8ee446"
Mar 09 03:43:00 crc kubenswrapper[4901]: I0309 03:43:00.862941 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 03:43:00 crc kubenswrapper[4901]: I0309 03:43:00.863612 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 03:43:30 crc kubenswrapper[4901]: I0309 03:43:30.862954 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 03:43:30 crc kubenswrapper[4901]: I0309 03:43:30.863482 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 03:43:30 crc kubenswrapper[4901]: I0309 03:43:30.863540 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998"
Mar 09 03:43:30 crc kubenswrapper[4901]: I0309 03:43:30.864178 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f516d08805ad123344b125ee30303c001e014432b07ec580df27cb03e8215ae7"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 03:43:30 crc kubenswrapper[4901]: I0309 03:43:30.864255 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://f516d08805ad123344b125ee30303c001e014432b07ec580df27cb03e8215ae7" gracePeriod=600
Mar 09 03:43:31 crc kubenswrapper[4901]: I0309 03:43:31.335995 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="f516d08805ad123344b125ee30303c001e014432b07ec580df27cb03e8215ae7" exitCode=0
Mar 09 03:43:31 crc kubenswrapper[4901]: I0309 03:43:31.336024 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"f516d08805ad123344b125ee30303c001e014432b07ec580df27cb03e8215ae7"}
Mar 09 03:43:31 crc kubenswrapper[4901]: I0309 03:43:31.336424 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026"}
Mar 09 03:43:31 crc kubenswrapper[4901]: I0309 03:43:31.336448 4901 scope.go:117] "RemoveContainer" containerID="4e595b8a3bc8456f295eac6997fdba3d4dfa8618a5fe46f9ea1dce83a5d6f9be"
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.163323 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550464-tktjd"]
Mar 09 03:44:00 crc kubenswrapper[4901]: E0309 03:44:00.164598 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f6249c-909a-43ab-a1b4-b33e9473c5e8" containerName="oc"
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.164759 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f6249c-909a-43ab-a1b4-b33e9473c5e8" containerName="oc"
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.165061 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f6249c-909a-43ab-a1b4-b33e9473c5e8" containerName="oc"
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.166094 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550464-tktjd"
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.170411 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.170431 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.170616 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs"
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.175776 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550464-tktjd"]
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.270726 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j765p\" (UniqueName: \"kubernetes.io/projected/af508450-04dd-4c49-b595-c84aa1f509ac-kube-api-access-j765p\") pod \"auto-csr-approver-29550464-tktjd\" (UID: \"af508450-04dd-4c49-b595-c84aa1f509ac\") " pod="openshift-infra/auto-csr-approver-29550464-tktjd"
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.372418 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j765p\" (UniqueName: \"kubernetes.io/projected/af508450-04dd-4c49-b595-c84aa1f509ac-kube-api-access-j765p\") pod \"auto-csr-approver-29550464-tktjd\" (UID: \"af508450-04dd-4c49-b595-c84aa1f509ac\") " pod="openshift-infra/auto-csr-approver-29550464-tktjd"
Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.394934 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j765p\" (UniqueName: \"kubernetes.io/projected/af508450-04dd-4c49-b595-c84aa1f509ac-kube-api-access-j765p\") pod \"auto-csr-approver-29550464-tktjd\" (UID: \"af508450-04dd-4c49-b595-c84aa1f509ac\") "
pod="openshift-infra/auto-csr-approver-29550464-tktjd" Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.499189 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550464-tktjd" Mar 09 03:44:00 crc kubenswrapper[4901]: I0309 03:44:00.807157 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550464-tktjd"] Mar 09 03:44:01 crc kubenswrapper[4901]: I0309 03:44:01.613194 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550464-tktjd" event={"ID":"af508450-04dd-4c49-b595-c84aa1f509ac","Type":"ContainerStarted","Data":"b8a7ba1d4195f6dc35b758fab802eefb2173f6f6596817e10a983828e8419fbf"} Mar 09 03:44:02 crc kubenswrapper[4901]: I0309 03:44:02.625389 4901 generic.go:334] "Generic (PLEG): container finished" podID="af508450-04dd-4c49-b595-c84aa1f509ac" containerID="4b6a5495d7355a42aa7f37ac022b5887c43c84c12dfaf38b9f6818e2159ea48c" exitCode=0 Mar 09 03:44:02 crc kubenswrapper[4901]: I0309 03:44:02.625477 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550464-tktjd" event={"ID":"af508450-04dd-4c49-b595-c84aa1f509ac","Type":"ContainerDied","Data":"4b6a5495d7355a42aa7f37ac022b5887c43c84c12dfaf38b9f6818e2159ea48c"} Mar 09 03:44:04 crc kubenswrapper[4901]: I0309 03:44:04.002471 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550464-tktjd" Mar 09 03:44:04 crc kubenswrapper[4901]: I0309 03:44:04.130028 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j765p\" (UniqueName: \"kubernetes.io/projected/af508450-04dd-4c49-b595-c84aa1f509ac-kube-api-access-j765p\") pod \"af508450-04dd-4c49-b595-c84aa1f509ac\" (UID: \"af508450-04dd-4c49-b595-c84aa1f509ac\") " Mar 09 03:44:04 crc kubenswrapper[4901]: I0309 03:44:04.137317 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af508450-04dd-4c49-b595-c84aa1f509ac-kube-api-access-j765p" (OuterVolumeSpecName: "kube-api-access-j765p") pod "af508450-04dd-4c49-b595-c84aa1f509ac" (UID: "af508450-04dd-4c49-b595-c84aa1f509ac"). InnerVolumeSpecName "kube-api-access-j765p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:44:04 crc kubenswrapper[4901]: I0309 03:44:04.232005 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j765p\" (UniqueName: \"kubernetes.io/projected/af508450-04dd-4c49-b595-c84aa1f509ac-kube-api-access-j765p\") on node \"crc\" DevicePath \"\"" Mar 09 03:44:04 crc kubenswrapper[4901]: I0309 03:44:04.649088 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550464-tktjd" event={"ID":"af508450-04dd-4c49-b595-c84aa1f509ac","Type":"ContainerDied","Data":"b8a7ba1d4195f6dc35b758fab802eefb2173f6f6596817e10a983828e8419fbf"} Mar 09 03:44:04 crc kubenswrapper[4901]: I0309 03:44:04.649162 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8a7ba1d4195f6dc35b758fab802eefb2173f6f6596817e10a983828e8419fbf" Mar 09 03:44:04 crc kubenswrapper[4901]: I0309 03:44:04.649686 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550464-tktjd" Mar 09 03:44:05 crc kubenswrapper[4901]: I0309 03:44:05.106590 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550458-fzptb"] Mar 09 03:44:05 crc kubenswrapper[4901]: I0309 03:44:05.119106 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550458-fzptb"] Mar 09 03:44:06 crc kubenswrapper[4901]: I0309 03:44:06.123445 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc4c9e9f-d439-4994-8ac0-58713392ec57" path="/var/lib/kubelet/pods/fc4c9e9f-d439-4994-8ac0-58713392ec57/volumes" Mar 09 03:44:40 crc kubenswrapper[4901]: I0309 03:44:40.066690 4901 scope.go:117] "RemoveContainer" containerID="b3c104c29b028fff48a2227ef633803b64f801389653e5449fbf7e2b756b3c22" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.168528 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd"] Mar 09 03:45:00 crc kubenswrapper[4901]: E0309 03:45:00.169898 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af508450-04dd-4c49-b595-c84aa1f509ac" containerName="oc" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.169927 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="af508450-04dd-4c49-b595-c84aa1f509ac" containerName="oc" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.174430 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="af508450-04dd-4c49-b595-c84aa1f509ac" containerName="oc" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.175515 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.179350 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.187495 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.193724 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd"] Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.329729 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h65n8\" (UniqueName: \"kubernetes.io/projected/685ee708-3377-434c-b5ec-c7b61822f3e1-kube-api-access-h65n8\") pod \"collect-profiles-29550465-4qjkd\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.329888 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/685ee708-3377-434c-b5ec-c7b61822f3e1-secret-volume\") pod \"collect-profiles-29550465-4qjkd\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.330121 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/685ee708-3377-434c-b5ec-c7b61822f3e1-config-volume\") pod \"collect-profiles-29550465-4qjkd\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.431035 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/685ee708-3377-434c-b5ec-c7b61822f3e1-config-volume\") pod \"collect-profiles-29550465-4qjkd\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.431132 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h65n8\" (UniqueName: \"kubernetes.io/projected/685ee708-3377-434c-b5ec-c7b61822f3e1-kube-api-access-h65n8\") pod \"collect-profiles-29550465-4qjkd\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.431180 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/685ee708-3377-434c-b5ec-c7b61822f3e1-secret-volume\") pod \"collect-profiles-29550465-4qjkd\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.432860 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/685ee708-3377-434c-b5ec-c7b61822f3e1-config-volume\") pod \"collect-profiles-29550465-4qjkd\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.438119 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/685ee708-3377-434c-b5ec-c7b61822f3e1-secret-volume\") pod \"collect-profiles-29550465-4qjkd\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.459315 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h65n8\" (UniqueName: \"kubernetes.io/projected/685ee708-3377-434c-b5ec-c7b61822f3e1-kube-api-access-h65n8\") pod \"collect-profiles-29550465-4qjkd\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.512002 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:00 crc kubenswrapper[4901]: I0309 03:45:00.982532 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd"] Mar 09 03:45:01 crc kubenswrapper[4901]: I0309 03:45:01.181624 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" event={"ID":"685ee708-3377-434c-b5ec-c7b61822f3e1","Type":"ContainerStarted","Data":"278c53b7cd9b6b602f979211b77f19a38dffa551b943485bcaed0a5baff64e03"} Mar 09 03:45:01 crc kubenswrapper[4901]: I0309 03:45:01.183054 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" event={"ID":"685ee708-3377-434c-b5ec-c7b61822f3e1","Type":"ContainerStarted","Data":"7485320a5a281aad8ee875ec5d95b6bec6df5efad8b71df63921a703b57ed6aa"} Mar 09 03:45:02 crc kubenswrapper[4901]: I0309 03:45:02.192063 4901 generic.go:334] "Generic (PLEG): container finished" podID="685ee708-3377-434c-b5ec-c7b61822f3e1" 
containerID="278c53b7cd9b6b602f979211b77f19a38dffa551b943485bcaed0a5baff64e03" exitCode=0 Mar 09 03:45:02 crc kubenswrapper[4901]: I0309 03:45:02.192135 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" event={"ID":"685ee708-3377-434c-b5ec-c7b61822f3e1","Type":"ContainerDied","Data":"278c53b7cd9b6b602f979211b77f19a38dffa551b943485bcaed0a5baff64e03"} Mar 09 03:45:03 crc kubenswrapper[4901]: I0309 03:45:03.594710 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:03 crc kubenswrapper[4901]: I0309 03:45:03.699865 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/685ee708-3377-434c-b5ec-c7b61822f3e1-secret-volume\") pod \"685ee708-3377-434c-b5ec-c7b61822f3e1\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " Mar 09 03:45:03 crc kubenswrapper[4901]: I0309 03:45:03.700203 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h65n8\" (UniqueName: \"kubernetes.io/projected/685ee708-3377-434c-b5ec-c7b61822f3e1-kube-api-access-h65n8\") pod \"685ee708-3377-434c-b5ec-c7b61822f3e1\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " Mar 09 03:45:03 crc kubenswrapper[4901]: I0309 03:45:03.700287 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/685ee708-3377-434c-b5ec-c7b61822f3e1-config-volume\") pod \"685ee708-3377-434c-b5ec-c7b61822f3e1\" (UID: \"685ee708-3377-434c-b5ec-c7b61822f3e1\") " Mar 09 03:45:03 crc kubenswrapper[4901]: I0309 03:45:03.700970 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/685ee708-3377-434c-b5ec-c7b61822f3e1-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"685ee708-3377-434c-b5ec-c7b61822f3e1" (UID: "685ee708-3377-434c-b5ec-c7b61822f3e1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 03:45:03 crc kubenswrapper[4901]: I0309 03:45:03.706371 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/685ee708-3377-434c-b5ec-c7b61822f3e1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "685ee708-3377-434c-b5ec-c7b61822f3e1" (UID: "685ee708-3377-434c-b5ec-c7b61822f3e1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 03:45:03 crc kubenswrapper[4901]: I0309 03:45:03.707914 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/685ee708-3377-434c-b5ec-c7b61822f3e1-kube-api-access-h65n8" (OuterVolumeSpecName: "kube-api-access-h65n8") pod "685ee708-3377-434c-b5ec-c7b61822f3e1" (UID: "685ee708-3377-434c-b5ec-c7b61822f3e1"). InnerVolumeSpecName "kube-api-access-h65n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:45:03 crc kubenswrapper[4901]: I0309 03:45:03.802620 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/685ee708-3377-434c-b5ec-c7b61822f3e1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 03:45:03 crc kubenswrapper[4901]: I0309 03:45:03.802673 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/685ee708-3377-434c-b5ec-c7b61822f3e1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 03:45:03 crc kubenswrapper[4901]: I0309 03:45:03.802693 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h65n8\" (UniqueName: \"kubernetes.io/projected/685ee708-3377-434c-b5ec-c7b61822f3e1-kube-api-access-h65n8\") on node \"crc\" DevicePath \"\"" Mar 09 03:45:04 crc kubenswrapper[4901]: I0309 03:45:04.215611 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" event={"ID":"685ee708-3377-434c-b5ec-c7b61822f3e1","Type":"ContainerDied","Data":"7485320a5a281aad8ee875ec5d95b6bec6df5efad8b71df63921a703b57ed6aa"} Mar 09 03:45:04 crc kubenswrapper[4901]: I0309 03:45:04.215652 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd" Mar 09 03:45:04 crc kubenswrapper[4901]: I0309 03:45:04.215651 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7485320a5a281aad8ee875ec5d95b6bec6df5efad8b71df63921a703b57ed6aa" Mar 09 03:45:04 crc kubenswrapper[4901]: I0309 03:45:04.272620 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq"] Mar 09 03:45:04 crc kubenswrapper[4901]: I0309 03:45:04.279519 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550420-pqfpq"] Mar 09 03:45:06 crc kubenswrapper[4901]: I0309 03:45:06.124755 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d19c465-0987-41b6-b948-6a61d368bac4" path="/var/lib/kubelet/pods/7d19c465-0987-41b6-b948-6a61d368bac4/volumes" Mar 09 03:45:17 crc kubenswrapper[4901]: I0309 03:45:17.745955 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sng9q"] Mar 09 03:45:17 crc kubenswrapper[4901]: E0309 03:45:17.747132 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685ee708-3377-434c-b5ec-c7b61822f3e1" containerName="collect-profiles" Mar 09 03:45:17 crc kubenswrapper[4901]: I0309 03:45:17.747158 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="685ee708-3377-434c-b5ec-c7b61822f3e1" containerName="collect-profiles" Mar 09 03:45:17 crc kubenswrapper[4901]: I0309 03:45:17.747538 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="685ee708-3377-434c-b5ec-c7b61822f3e1" containerName="collect-profiles" Mar 09 03:45:17 crc kubenswrapper[4901]: I0309 03:45:17.749445 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:17 crc kubenswrapper[4901]: I0309 03:45:17.764602 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sng9q"] Mar 09 03:45:17 crc kubenswrapper[4901]: I0309 03:45:17.949898 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-catalog-content\") pod \"community-operators-sng9q\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:17 crc kubenswrapper[4901]: I0309 03:45:17.949961 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6pts\" (UniqueName: \"kubernetes.io/projected/21db27ff-04d1-44da-b347-671752e5246e-kube-api-access-l6pts\") pod \"community-operators-sng9q\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:17 crc kubenswrapper[4901]: I0309 03:45:17.950152 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-utilities\") pod \"community-operators-sng9q\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:18 crc kubenswrapper[4901]: I0309 03:45:18.051261 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-utilities\") pod \"community-operators-sng9q\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:18 crc kubenswrapper[4901]: I0309 03:45:18.051352 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-catalog-content\") pod \"community-operators-sng9q\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:18 crc kubenswrapper[4901]: I0309 03:45:18.051374 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6pts\" (UniqueName: \"kubernetes.io/projected/21db27ff-04d1-44da-b347-671752e5246e-kube-api-access-l6pts\") pod \"community-operators-sng9q\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:18 crc kubenswrapper[4901]: I0309 03:45:18.051808 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-catalog-content\") pod \"community-operators-sng9q\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:18 crc kubenswrapper[4901]: I0309 03:45:18.051933 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-utilities\") pod \"community-operators-sng9q\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:18 crc kubenswrapper[4901]: I0309 03:45:18.070507 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6pts\" (UniqueName: \"kubernetes.io/projected/21db27ff-04d1-44da-b347-671752e5246e-kube-api-access-l6pts\") pod \"community-operators-sng9q\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:18 crc kubenswrapper[4901]: I0309 03:45:18.075971 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:18 crc kubenswrapper[4901]: I0309 03:45:18.638250 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sng9q"] Mar 09 03:45:19 crc kubenswrapper[4901]: I0309 03:45:19.359994 4901 generic.go:334] "Generic (PLEG): container finished" podID="21db27ff-04d1-44da-b347-671752e5246e" containerID="2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e" exitCode=0 Mar 09 03:45:19 crc kubenswrapper[4901]: I0309 03:45:19.360070 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sng9q" event={"ID":"21db27ff-04d1-44da-b347-671752e5246e","Type":"ContainerDied","Data":"2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e"} Mar 09 03:45:19 crc kubenswrapper[4901]: I0309 03:45:19.360115 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sng9q" event={"ID":"21db27ff-04d1-44da-b347-671752e5246e","Type":"ContainerStarted","Data":"0a6ccfacc967fd6e00ba8f578debe32813a012ca0e2899ad772d13843ee79d09"} Mar 09 03:45:21 crc kubenswrapper[4901]: I0309 03:45:21.376869 4901 generic.go:334] "Generic (PLEG): container finished" podID="21db27ff-04d1-44da-b347-671752e5246e" containerID="5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b" exitCode=0 Mar 09 03:45:21 crc kubenswrapper[4901]: I0309 03:45:21.376968 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sng9q" event={"ID":"21db27ff-04d1-44da-b347-671752e5246e","Type":"ContainerDied","Data":"5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b"} Mar 09 03:45:23 crc kubenswrapper[4901]: I0309 03:45:23.395675 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sng9q" 
event={"ID":"21db27ff-04d1-44da-b347-671752e5246e","Type":"ContainerStarted","Data":"e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c"} Mar 09 03:45:23 crc kubenswrapper[4901]: I0309 03:45:23.417667 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sng9q" podStartSLOduration=3.5095783149999997 podStartE2EDuration="6.417641946s" podCreationTimestamp="2026-03-09 03:45:17 +0000 UTC" firstStartedPulling="2026-03-09 03:45:19.363586507 +0000 UTC m=+3843.953250279" lastFinishedPulling="2026-03-09 03:45:22.271650168 +0000 UTC m=+3846.861313910" observedRunningTime="2026-03-09 03:45:23.411464263 +0000 UTC m=+3848.001128005" watchObservedRunningTime="2026-03-09 03:45:23.417641946 +0000 UTC m=+3848.007305708" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.214059 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m9v7g"] Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.215822 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.269831 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9v7g"] Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.276504 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-utilities\") pod \"redhat-marketplace-m9v7g\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.276570 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4gvm\" (UniqueName: \"kubernetes.io/projected/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-kube-api-access-k4gvm\") pod \"redhat-marketplace-m9v7g\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.276611 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-catalog-content\") pod \"redhat-marketplace-m9v7g\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.378054 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-utilities\") pod \"redhat-marketplace-m9v7g\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.378127 4901 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-k4gvm\" (UniqueName: \"kubernetes.io/projected/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-kube-api-access-k4gvm\") pod \"redhat-marketplace-m9v7g\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.378178 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-catalog-content\") pod \"redhat-marketplace-m9v7g\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.378725 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-catalog-content\") pod \"redhat-marketplace-m9v7g\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.379011 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-utilities\") pod \"redhat-marketplace-m9v7g\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.404278 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4gvm\" (UniqueName: \"kubernetes.io/projected/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-kube-api-access-k4gvm\") pod \"redhat-marketplace-m9v7g\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:24 crc kubenswrapper[4901]: I0309 03:45:24.529732 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:25 crc kubenswrapper[4901]: W0309 03:45:25.007878 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc67ea62_3ce8_43d7_b680_34ae32e8eab6.slice/crio-d4ea396842ddcf99c490fe3788a50f577f95aacd2dd19d381c08fb16c0c2d0d3 WatchSource:0}: Error finding container d4ea396842ddcf99c490fe3788a50f577f95aacd2dd19d381c08fb16c0c2d0d3: Status 404 returned error can't find the container with id d4ea396842ddcf99c490fe3788a50f577f95aacd2dd19d381c08fb16c0c2d0d3 Mar 09 03:45:25 crc kubenswrapper[4901]: I0309 03:45:25.008963 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9v7g"] Mar 09 03:45:25 crc kubenswrapper[4901]: I0309 03:45:25.412815 4901 generic.go:334] "Generic (PLEG): container finished" podID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" containerID="9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0" exitCode=0 Mar 09 03:45:25 crc kubenswrapper[4901]: I0309 03:45:25.412888 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9v7g" event={"ID":"bc67ea62-3ce8-43d7-b680-34ae32e8eab6","Type":"ContainerDied","Data":"9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0"} Mar 09 03:45:25 crc kubenswrapper[4901]: I0309 03:45:25.413535 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9v7g" event={"ID":"bc67ea62-3ce8-43d7-b680-34ae32e8eab6","Type":"ContainerStarted","Data":"d4ea396842ddcf99c490fe3788a50f577f95aacd2dd19d381c08fb16c0c2d0d3"} Mar 09 03:45:26 crc kubenswrapper[4901]: I0309 03:45:26.424205 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9v7g" 
event={"ID":"bc67ea62-3ce8-43d7-b680-34ae32e8eab6","Type":"ContainerStarted","Data":"ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96"} Mar 09 03:45:27 crc kubenswrapper[4901]: I0309 03:45:27.437385 4901 generic.go:334] "Generic (PLEG): container finished" podID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" containerID="ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96" exitCode=0 Mar 09 03:45:27 crc kubenswrapper[4901]: I0309 03:45:27.437450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9v7g" event={"ID":"bc67ea62-3ce8-43d7-b680-34ae32e8eab6","Type":"ContainerDied","Data":"ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96"} Mar 09 03:45:28 crc kubenswrapper[4901]: I0309 03:45:28.076567 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:28 crc kubenswrapper[4901]: I0309 03:45:28.078335 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:28 crc kubenswrapper[4901]: I0309 03:45:28.163948 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:28 crc kubenswrapper[4901]: I0309 03:45:28.446691 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9v7g" event={"ID":"bc67ea62-3ce8-43d7-b680-34ae32e8eab6","Type":"ContainerStarted","Data":"dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee"} Mar 09 03:45:28 crc kubenswrapper[4901]: I0309 03:45:28.476548 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m9v7g" podStartSLOduration=1.9160483689999999 podStartE2EDuration="4.4765324s" podCreationTimestamp="2026-03-09 03:45:24 +0000 UTC" firstStartedPulling="2026-03-09 03:45:25.415504908 +0000 UTC 
m=+3850.005168670" lastFinishedPulling="2026-03-09 03:45:27.975988969 +0000 UTC m=+3852.565652701" observedRunningTime="2026-03-09 03:45:28.476250623 +0000 UTC m=+3853.065914355" watchObservedRunningTime="2026-03-09 03:45:28.4765324 +0000 UTC m=+3853.066196132" Mar 09 03:45:28 crc kubenswrapper[4901]: I0309 03:45:28.552791 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:30 crc kubenswrapper[4901]: I0309 03:45:30.502053 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sng9q"] Mar 09 03:45:31 crc kubenswrapper[4901]: I0309 03:45:31.470123 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sng9q" podUID="21db27ff-04d1-44da-b347-671752e5246e" containerName="registry-server" containerID="cri-o://e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c" gracePeriod=2 Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.429298 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.499930 4901 generic.go:334] "Generic (PLEG): container finished" podID="21db27ff-04d1-44da-b347-671752e5246e" containerID="e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c" exitCode=0 Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.499972 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sng9q" event={"ID":"21db27ff-04d1-44da-b347-671752e5246e","Type":"ContainerDied","Data":"e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c"} Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.499999 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sng9q" event={"ID":"21db27ff-04d1-44da-b347-671752e5246e","Type":"ContainerDied","Data":"0a6ccfacc967fd6e00ba8f578debe32813a012ca0e2899ad772d13843ee79d09"} Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.500016 4901 scope.go:117] "RemoveContainer" containerID="e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.500038 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sng9q" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.510651 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-utilities\") pod \"21db27ff-04d1-44da-b347-671752e5246e\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.510822 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-catalog-content\") pod \"21db27ff-04d1-44da-b347-671752e5246e\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.510919 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6pts\" (UniqueName: \"kubernetes.io/projected/21db27ff-04d1-44da-b347-671752e5246e-kube-api-access-l6pts\") pod \"21db27ff-04d1-44da-b347-671752e5246e\" (UID: \"21db27ff-04d1-44da-b347-671752e5246e\") " Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.511549 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-utilities" (OuterVolumeSpecName: "utilities") pod "21db27ff-04d1-44da-b347-671752e5246e" (UID: "21db27ff-04d1-44da-b347-671752e5246e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.516186 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21db27ff-04d1-44da-b347-671752e5246e-kube-api-access-l6pts" (OuterVolumeSpecName: "kube-api-access-l6pts") pod "21db27ff-04d1-44da-b347-671752e5246e" (UID: "21db27ff-04d1-44da-b347-671752e5246e"). InnerVolumeSpecName "kube-api-access-l6pts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.521228 4901 scope.go:117] "RemoveContainer" containerID="5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.557201 4901 scope.go:117] "RemoveContainer" containerID="2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.575645 4901 scope.go:117] "RemoveContainer" containerID="e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c" Mar 09 03:45:32 crc kubenswrapper[4901]: E0309 03:45:32.576426 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c\": container with ID starting with e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c not found: ID does not exist" containerID="e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.576471 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c"} err="failed to get container status \"e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c\": rpc error: code = NotFound desc = could not find container \"e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c\": container with ID starting with e1d5faf4faf7eae88cb0b9ecb119db8095f32baf10668ac4cc641ed8a7fb1a9c not found: ID does not exist" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.576496 4901 scope.go:117] "RemoveContainer" containerID="5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b" Mar 09 03:45:32 crc kubenswrapper[4901]: E0309 03:45:32.576880 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b\": container with ID starting with 5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b not found: ID does not exist" containerID="5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.576911 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b"} err="failed to get container status \"5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b\": rpc error: code = NotFound desc = could not find container \"5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b\": container with ID starting with 5b8cba18d1ebbb5a1b676cf186d7f27a2b6266b808153bbd731d8128b34eb56b not found: ID does not exist" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.576932 4901 scope.go:117] "RemoveContainer" containerID="2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e" Mar 09 03:45:32 crc kubenswrapper[4901]: E0309 03:45:32.577346 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e\": container with ID starting with 2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e not found: ID does not exist" containerID="2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.577373 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e"} err="failed to get container status \"2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e\": rpc error: code = NotFound desc = could not find container \"2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e\": 
container with ID starting with 2121f7ac2c4d4156fbde41d0dad0cccacc82868ffa92f005572847f9d1db775e not found: ID does not exist" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.587081 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21db27ff-04d1-44da-b347-671752e5246e" (UID: "21db27ff-04d1-44da-b347-671752e5246e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.612907 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6pts\" (UniqueName: \"kubernetes.io/projected/21db27ff-04d1-44da-b347-671752e5246e-kube-api-access-l6pts\") on node \"crc\" DevicePath \"\"" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.612943 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.612959 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21db27ff-04d1-44da-b347-671752e5246e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.842155 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sng9q"] Mar 09 03:45:32 crc kubenswrapper[4901]: I0309 03:45:32.849158 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sng9q"] Mar 09 03:45:34 crc kubenswrapper[4901]: I0309 03:45:34.123388 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21db27ff-04d1-44da-b347-671752e5246e" path="/var/lib/kubelet/pods/21db27ff-04d1-44da-b347-671752e5246e/volumes" Mar 09 03:45:34 crc 
kubenswrapper[4901]: I0309 03:45:34.531179 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:34 crc kubenswrapper[4901]: I0309 03:45:34.531517 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:34 crc kubenswrapper[4901]: I0309 03:45:34.580558 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:35 crc kubenswrapper[4901]: I0309 03:45:35.586578 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:36 crc kubenswrapper[4901]: I0309 03:45:36.704467 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9v7g"] Mar 09 03:45:37 crc kubenswrapper[4901]: I0309 03:45:37.540946 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m9v7g" podUID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" containerName="registry-server" containerID="cri-o://dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee" gracePeriod=2 Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.049658 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.095611 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-utilities\") pod \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.095808 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-catalog-content\") pod \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.095890 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4gvm\" (UniqueName: \"kubernetes.io/projected/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-kube-api-access-k4gvm\") pod \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\" (UID: \"bc67ea62-3ce8-43d7-b680-34ae32e8eab6\") " Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.096798 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-utilities" (OuterVolumeSpecName: "utilities") pod "bc67ea62-3ce8-43d7-b680-34ae32e8eab6" (UID: "bc67ea62-3ce8-43d7-b680-34ae32e8eab6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.103563 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-kube-api-access-k4gvm" (OuterVolumeSpecName: "kube-api-access-k4gvm") pod "bc67ea62-3ce8-43d7-b680-34ae32e8eab6" (UID: "bc67ea62-3ce8-43d7-b680-34ae32e8eab6"). InnerVolumeSpecName "kube-api-access-k4gvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.140449 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc67ea62-3ce8-43d7-b680-34ae32e8eab6" (UID: "bc67ea62-3ce8-43d7-b680-34ae32e8eab6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.197392 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.197429 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4gvm\" (UniqueName: \"kubernetes.io/projected/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-kube-api-access-k4gvm\") on node \"crc\" DevicePath \"\"" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.197443 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc67ea62-3ce8-43d7-b680-34ae32e8eab6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.550506 4901 generic.go:334] "Generic (PLEG): container finished" podID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" containerID="dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee" exitCode=0 Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.550548 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9v7g" event={"ID":"bc67ea62-3ce8-43d7-b680-34ae32e8eab6","Type":"ContainerDied","Data":"dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee"} Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.550623 4901 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-m9v7g" event={"ID":"bc67ea62-3ce8-43d7-b680-34ae32e8eab6","Type":"ContainerDied","Data":"d4ea396842ddcf99c490fe3788a50f577f95aacd2dd19d381c08fb16c0c2d0d3"} Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.550648 4901 scope.go:117] "RemoveContainer" containerID="dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.550676 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9v7g" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.578805 4901 scope.go:117] "RemoveContainer" containerID="ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.610630 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9v7g"] Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.621008 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9v7g"] Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.631494 4901 scope.go:117] "RemoveContainer" containerID="9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.661668 4901 scope.go:117] "RemoveContainer" containerID="dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee" Mar 09 03:45:38 crc kubenswrapper[4901]: E0309 03:45:38.662595 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee\": container with ID starting with dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee not found: ID does not exist" containerID="dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.662638 4901 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee"} err="failed to get container status \"dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee\": rpc error: code = NotFound desc = could not find container \"dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee\": container with ID starting with dc85d83cab9aa37a879c80ccb671613e8887158791fbc000c639efce0c4caeee not found: ID does not exist" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.662664 4901 scope.go:117] "RemoveContainer" containerID="ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96" Mar 09 03:45:38 crc kubenswrapper[4901]: E0309 03:45:38.663080 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96\": container with ID starting with ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96 not found: ID does not exist" containerID="ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.663136 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96"} err="failed to get container status \"ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96\": rpc error: code = NotFound desc = could not find container \"ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96\": container with ID starting with ff0f63f01262b19f414b673c614b2fd9bb1003bab234cbef1995382eb274ea96 not found: ID does not exist" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.663179 4901 scope.go:117] "RemoveContainer" containerID="9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0" Mar 09 03:45:38 crc kubenswrapper[4901]: E0309 
03:45:38.663673 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0\": container with ID starting with 9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0 not found: ID does not exist" containerID="9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0" Mar 09 03:45:38 crc kubenswrapper[4901]: I0309 03:45:38.663703 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0"} err="failed to get container status \"9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0\": rpc error: code = NotFound desc = could not find container \"9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0\": container with ID starting with 9e53946f185fd66b811552199396ced8e5080af0bb28700da3404120d95628f0 not found: ID does not exist" Mar 09 03:45:40 crc kubenswrapper[4901]: I0309 03:45:40.126404 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" path="/var/lib/kubelet/pods/bc67ea62-3ce8-43d7-b680-34ae32e8eab6/volumes" Mar 09 03:45:40 crc kubenswrapper[4901]: I0309 03:45:40.136519 4901 scope.go:117] "RemoveContainer" containerID="ed9c4c3e79f1be3c311a803930c595ab18d827a0f272a4923403ead5508b1d9d" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.175414 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550466-xvdcj"] Mar 09 03:46:00 crc kubenswrapper[4901]: E0309 03:46:00.176705 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" containerName="registry-server" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.176736 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" 
containerName="registry-server" Mar 09 03:46:00 crc kubenswrapper[4901]: E0309 03:46:00.176772 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" containerName="extract-utilities" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.176789 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" containerName="extract-utilities" Mar 09 03:46:00 crc kubenswrapper[4901]: E0309 03:46:00.176827 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21db27ff-04d1-44da-b347-671752e5246e" containerName="extract-utilities" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.176847 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="21db27ff-04d1-44da-b347-671752e5246e" containerName="extract-utilities" Mar 09 03:46:00 crc kubenswrapper[4901]: E0309 03:46:00.176908 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" containerName="extract-content" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.176924 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" containerName="extract-content" Mar 09 03:46:00 crc kubenswrapper[4901]: E0309 03:46:00.176946 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21db27ff-04d1-44da-b347-671752e5246e" containerName="registry-server" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.176961 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="21db27ff-04d1-44da-b347-671752e5246e" containerName="registry-server" Mar 09 03:46:00 crc kubenswrapper[4901]: E0309 03:46:00.176992 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21db27ff-04d1-44da-b347-671752e5246e" containerName="extract-content" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.177007 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="21db27ff-04d1-44da-b347-671752e5246e" 
containerName="extract-content" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.177389 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="21db27ff-04d1-44da-b347-671752e5246e" containerName="registry-server" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.177443 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc67ea62-3ce8-43d7-b680-34ae32e8eab6" containerName="registry-server" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.178430 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550466-xvdcj" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.182112 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.182601 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.182918 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.190605 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550466-xvdcj"] Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.352548 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svbpj\" (UniqueName: \"kubernetes.io/projected/8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c-kube-api-access-svbpj\") pod \"auto-csr-approver-29550466-xvdcj\" (UID: \"8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c\") " pod="openshift-infra/auto-csr-approver-29550466-xvdcj" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.453755 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svbpj\" (UniqueName: 
\"kubernetes.io/projected/8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c-kube-api-access-svbpj\") pod \"auto-csr-approver-29550466-xvdcj\" (UID: \"8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c\") " pod="openshift-infra/auto-csr-approver-29550466-xvdcj" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.492644 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svbpj\" (UniqueName: \"kubernetes.io/projected/8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c-kube-api-access-svbpj\") pod \"auto-csr-approver-29550466-xvdcj\" (UID: \"8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c\") " pod="openshift-infra/auto-csr-approver-29550466-xvdcj" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.516063 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550466-xvdcj" Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.819205 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550466-xvdcj"] Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.862757 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:46:00 crc kubenswrapper[4901]: I0309 03:46:00.862835 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:46:01 crc kubenswrapper[4901]: I0309 03:46:01.786473 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550466-xvdcj" 
event={"ID":"8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c","Type":"ContainerStarted","Data":"df75498fc063907cbfab37e7c9c969332e72f6a60472aa606358caecd187e858"} Mar 09 03:46:02 crc kubenswrapper[4901]: I0309 03:46:02.801658 4901 generic.go:334] "Generic (PLEG): container finished" podID="8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c" containerID="ff4d20b44289f2c1b459d690d0c146978428c682c423c25ae827bc1424c9336f" exitCode=0 Mar 09 03:46:02 crc kubenswrapper[4901]: I0309 03:46:02.801737 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550466-xvdcj" event={"ID":"8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c","Type":"ContainerDied","Data":"ff4d20b44289f2c1b459d690d0c146978428c682c423c25ae827bc1424c9336f"} Mar 09 03:46:04 crc kubenswrapper[4901]: I0309 03:46:04.235982 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550466-xvdcj" Mar 09 03:46:04 crc kubenswrapper[4901]: I0309 03:46:04.252133 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svbpj\" (UniqueName: \"kubernetes.io/projected/8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c-kube-api-access-svbpj\") pod \"8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c\" (UID: \"8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c\") " Mar 09 03:46:04 crc kubenswrapper[4901]: I0309 03:46:04.262535 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c-kube-api-access-svbpj" (OuterVolumeSpecName: "kube-api-access-svbpj") pod "8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c" (UID: "8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c"). InnerVolumeSpecName "kube-api-access-svbpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:46:04 crc kubenswrapper[4901]: I0309 03:46:04.353996 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svbpj\" (UniqueName: \"kubernetes.io/projected/8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c-kube-api-access-svbpj\") on node \"crc\" DevicePath \"\"" Mar 09 03:46:04 crc kubenswrapper[4901]: I0309 03:46:04.844036 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550466-xvdcj" event={"ID":"8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c","Type":"ContainerDied","Data":"df75498fc063907cbfab37e7c9c969332e72f6a60472aa606358caecd187e858"} Mar 09 03:46:04 crc kubenswrapper[4901]: I0309 03:46:04.844095 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df75498fc063907cbfab37e7c9c969332e72f6a60472aa606358caecd187e858" Mar 09 03:46:04 crc kubenswrapper[4901]: I0309 03:46:04.844174 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550466-xvdcj" Mar 09 03:46:05 crc kubenswrapper[4901]: I0309 03:46:05.327452 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550460-d7sfz"] Mar 09 03:46:05 crc kubenswrapper[4901]: I0309 03:46:05.332779 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550460-d7sfz"] Mar 09 03:46:06 crc kubenswrapper[4901]: I0309 03:46:06.124771 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f29225-f0d0-47b2-8817-0cf513c7f9fb" path="/var/lib/kubelet/pods/f6f29225-f0d0-47b2-8817-0cf513c7f9fb/volumes" Mar 09 03:46:30 crc kubenswrapper[4901]: I0309 03:46:30.862573 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 09 03:46:30 crc kubenswrapper[4901]: I0309 03:46:30.863171 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:46:40 crc kubenswrapper[4901]: I0309 03:46:40.256288 4901 scope.go:117] "RemoveContainer" containerID="31645a9ab252414bee756c22fe7cffb652c70f26b63dee87cf21db0ea5f7250b" Mar 09 03:47:00 crc kubenswrapper[4901]: I0309 03:47:00.863395 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:47:00 crc kubenswrapper[4901]: I0309 03:47:00.866119 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:47:00 crc kubenswrapper[4901]: I0309 03:47:00.866451 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 03:47:00 crc kubenswrapper[4901]: I0309 03:47:00.867685 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
09 03:47:00 crc kubenswrapper[4901]: I0309 03:47:00.867957 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" gracePeriod=600 Mar 09 03:47:01 crc kubenswrapper[4901]: I0309 03:47:01.376185 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" exitCode=0 Mar 09 03:47:01 crc kubenswrapper[4901]: I0309 03:47:01.376407 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026"} Mar 09 03:47:01 crc kubenswrapper[4901]: I0309 03:47:01.376902 4901 scope.go:117] "RemoveContainer" containerID="f516d08805ad123344b125ee30303c001e014432b07ec580df27cb03e8215ae7" Mar 09 03:47:01 crc kubenswrapper[4901]: E0309 03:47:01.736627 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:47:02 crc kubenswrapper[4901]: I0309 03:47:02.391256 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:47:02 crc kubenswrapper[4901]: E0309 03:47:02.391760 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:47:14 crc kubenswrapper[4901]: I0309 03:47:14.106967 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:47:14 crc kubenswrapper[4901]: E0309 03:47:14.108149 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:47:20 crc kubenswrapper[4901]: I0309 03:47:20.989104 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8v6q7"] Mar 09 03:47:20 crc kubenswrapper[4901]: E0309 03:47:20.990182 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c" containerName="oc" Mar 09 03:47:20 crc kubenswrapper[4901]: I0309 03:47:20.990204 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c" containerName="oc" Mar 09 03:47:20 crc kubenswrapper[4901]: I0309 03:47:20.990563 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c" containerName="oc" Mar 09 03:47:20 crc kubenswrapper[4901]: I0309 03:47:20.992368 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.006488 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8v6q7"] Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.110326 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-catalog-content\") pod \"certified-operators-8v6q7\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.110525 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqgz5\" (UniqueName: \"kubernetes.io/projected/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-kube-api-access-lqgz5\") pod \"certified-operators-8v6q7\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.110668 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-utilities\") pod \"certified-operators-8v6q7\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.181162 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4rhpl"] Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.182837 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.212570 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-catalog-content\") pod \"certified-operators-8v6q7\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.212658 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqgz5\" (UniqueName: \"kubernetes.io/projected/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-kube-api-access-lqgz5\") pod \"certified-operators-8v6q7\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.212691 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-utilities\") pod \"certified-operators-8v6q7\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.213302 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-utilities\") pod \"certified-operators-8v6q7\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.213441 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-catalog-content\") pod \"certified-operators-8v6q7\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " 
pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.242671 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4rhpl"] Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.248203 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqgz5\" (UniqueName: \"kubernetes.io/projected/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-kube-api-access-lqgz5\") pod \"certified-operators-8v6q7\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.313989 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-utilities\") pod \"redhat-operators-4rhpl\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.314036 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbnlh\" (UniqueName: \"kubernetes.io/projected/0ae31526-1b58-4e36-bbd0-ecd908e453d6-kube-api-access-kbnlh\") pod \"redhat-operators-4rhpl\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.314112 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-catalog-content\") pod \"redhat-operators-4rhpl\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.329961 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.415239 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-catalog-content\") pod \"redhat-operators-4rhpl\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.415479 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-utilities\") pod \"redhat-operators-4rhpl\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.415509 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbnlh\" (UniqueName: \"kubernetes.io/projected/0ae31526-1b58-4e36-bbd0-ecd908e453d6-kube-api-access-kbnlh\") pod \"redhat-operators-4rhpl\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.415779 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-catalog-content\") pod \"redhat-operators-4rhpl\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.415914 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-utilities\") pod \"redhat-operators-4rhpl\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " 
pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.491637 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbnlh\" (UniqueName: \"kubernetes.io/projected/0ae31526-1b58-4e36-bbd0-ecd908e453d6-kube-api-access-kbnlh\") pod \"redhat-operators-4rhpl\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.500988 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.807273 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8v6q7"] Mar 09 03:47:21 crc kubenswrapper[4901]: I0309 03:47:21.940214 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4rhpl"] Mar 09 03:47:22 crc kubenswrapper[4901]: I0309 03:47:22.569112 4901 generic.go:334] "Generic (PLEG): container finished" podID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerID="5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e" exitCode=0 Mar 09 03:47:22 crc kubenswrapper[4901]: I0309 03:47:22.569151 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rhpl" event={"ID":"0ae31526-1b58-4e36-bbd0-ecd908e453d6","Type":"ContainerDied","Data":"5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e"} Mar 09 03:47:22 crc kubenswrapper[4901]: I0309 03:47:22.569185 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rhpl" event={"ID":"0ae31526-1b58-4e36-bbd0-ecd908e453d6","Type":"ContainerStarted","Data":"21c1f8c103073950a9b37f354d0e0248b1741ba124e7d867b46ab7af0d5805cd"} Mar 09 03:47:22 crc kubenswrapper[4901]: I0309 03:47:22.571262 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerID="8ba2eef9d6305361ae1662bf65cc60e1283f8358122f2886a12ff95e4cc7fbe9" exitCode=0 Mar 09 03:47:22 crc kubenswrapper[4901]: I0309 03:47:22.571321 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v6q7" event={"ID":"0dacb9c0-bfd5-4e95-98fb-1508dc14977f","Type":"ContainerDied","Data":"8ba2eef9d6305361ae1662bf65cc60e1283f8358122f2886a12ff95e4cc7fbe9"} Mar 09 03:47:22 crc kubenswrapper[4901]: I0309 03:47:22.571351 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v6q7" event={"ID":"0dacb9c0-bfd5-4e95-98fb-1508dc14977f","Type":"ContainerStarted","Data":"55a31e4c99b660e3b7b02b994a33e4fe14840466d16b7a49f5ee0f1cfdff99d0"} Mar 09 03:47:22 crc kubenswrapper[4901]: I0309 03:47:22.573344 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 03:47:24 crc kubenswrapper[4901]: I0309 03:47:24.608537 4901 generic.go:334] "Generic (PLEG): container finished" podID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerID="13ef48a4bb9f2c4f2f25d52df2c9cfc2719d3a067bb64074b75e239fa3162b21" exitCode=0 Mar 09 03:47:24 crc kubenswrapper[4901]: I0309 03:47:24.608662 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v6q7" event={"ID":"0dacb9c0-bfd5-4e95-98fb-1508dc14977f","Type":"ContainerDied","Data":"13ef48a4bb9f2c4f2f25d52df2c9cfc2719d3a067bb64074b75e239fa3162b21"} Mar 09 03:47:24 crc kubenswrapper[4901]: I0309 03:47:24.611198 4901 generic.go:334] "Generic (PLEG): container finished" podID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerID="06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b" exitCode=0 Mar 09 03:47:24 crc kubenswrapper[4901]: I0309 03:47:24.611248 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rhpl" 
event={"ID":"0ae31526-1b58-4e36-bbd0-ecd908e453d6","Type":"ContainerDied","Data":"06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b"} Mar 09 03:47:25 crc kubenswrapper[4901]: I0309 03:47:25.106315 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:47:25 crc kubenswrapper[4901]: E0309 03:47:25.106531 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:47:25 crc kubenswrapper[4901]: I0309 03:47:25.620644 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rhpl" event={"ID":"0ae31526-1b58-4e36-bbd0-ecd908e453d6","Type":"ContainerStarted","Data":"7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d"} Mar 09 03:47:25 crc kubenswrapper[4901]: I0309 03:47:25.622783 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v6q7" event={"ID":"0dacb9c0-bfd5-4e95-98fb-1508dc14977f","Type":"ContainerStarted","Data":"020b36a732099fda10e743c4c9ce07cdc14faa1a9f13c54f58019c327a1cad11"} Mar 09 03:47:25 crc kubenswrapper[4901]: I0309 03:47:25.640390 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4rhpl" podStartSLOduration=2.254271993 podStartE2EDuration="4.640377683s" podCreationTimestamp="2026-03-09 03:47:21 +0000 UTC" firstStartedPulling="2026-03-09 03:47:22.573056825 +0000 UTC m=+3967.162720567" lastFinishedPulling="2026-03-09 03:47:24.959162535 +0000 UTC m=+3969.548826257" observedRunningTime="2026-03-09 03:47:25.638543828 +0000 UTC 
m=+3970.228207550" watchObservedRunningTime="2026-03-09 03:47:25.640377683 +0000 UTC m=+3970.230041415" Mar 09 03:47:25 crc kubenswrapper[4901]: I0309 03:47:25.658516 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8v6q7" podStartSLOduration=3.126729051 podStartE2EDuration="5.658500241s" podCreationTimestamp="2026-03-09 03:47:20 +0000 UTC" firstStartedPulling="2026-03-09 03:47:22.573051285 +0000 UTC m=+3967.162715027" lastFinishedPulling="2026-03-09 03:47:25.104822455 +0000 UTC m=+3969.694486217" observedRunningTime="2026-03-09 03:47:25.655991049 +0000 UTC m=+3970.245654781" watchObservedRunningTime="2026-03-09 03:47:25.658500241 +0000 UTC m=+3970.248163973" Mar 09 03:47:31 crc kubenswrapper[4901]: I0309 03:47:31.331171 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:31 crc kubenswrapper[4901]: I0309 03:47:31.331475 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:31 crc kubenswrapper[4901]: I0309 03:47:31.386892 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:31 crc kubenswrapper[4901]: I0309 03:47:31.501909 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:31 crc kubenswrapper[4901]: I0309 03:47:31.502308 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:31 crc kubenswrapper[4901]: I0309 03:47:31.732033 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:32 crc kubenswrapper[4901]: I0309 03:47:32.547335 4901 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-4rhpl" podUID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerName="registry-server" probeResult="failure" output=< Mar 09 03:47:32 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 03:47:32 crc kubenswrapper[4901]: > Mar 09 03:47:35 crc kubenswrapper[4901]: I0309 03:47:35.171845 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8v6q7"] Mar 09 03:47:35 crc kubenswrapper[4901]: I0309 03:47:35.172896 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8v6q7" podUID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerName="registry-server" containerID="cri-o://020b36a732099fda10e743c4c9ce07cdc14faa1a9f13c54f58019c327a1cad11" gracePeriod=2 Mar 09 03:47:35 crc kubenswrapper[4901]: I0309 03:47:35.714465 4901 generic.go:334] "Generic (PLEG): container finished" podID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerID="020b36a732099fda10e743c4c9ce07cdc14faa1a9f13c54f58019c327a1cad11" exitCode=0 Mar 09 03:47:35 crc kubenswrapper[4901]: I0309 03:47:35.714526 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v6q7" event={"ID":"0dacb9c0-bfd5-4e95-98fb-1508dc14977f","Type":"ContainerDied","Data":"020b36a732099fda10e743c4c9ce07cdc14faa1a9f13c54f58019c327a1cad11"} Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.394968 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.546537 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-utilities\") pod \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.546619 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqgz5\" (UniqueName: \"kubernetes.io/projected/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-kube-api-access-lqgz5\") pod \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.546727 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-catalog-content\") pod \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\" (UID: \"0dacb9c0-bfd5-4e95-98fb-1508dc14977f\") " Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.548142 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-utilities" (OuterVolumeSpecName: "utilities") pod "0dacb9c0-bfd5-4e95-98fb-1508dc14977f" (UID: "0dacb9c0-bfd5-4e95-98fb-1508dc14977f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.646098 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dacb9c0-bfd5-4e95-98fb-1508dc14977f" (UID: "0dacb9c0-bfd5-4e95-98fb-1508dc14977f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.649714 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.649768 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.728305 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v6q7" event={"ID":"0dacb9c0-bfd5-4e95-98fb-1508dc14977f","Type":"ContainerDied","Data":"55a31e4c99b660e3b7b02b994a33e4fe14840466d16b7a49f5ee0f1cfdff99d0"} Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.728504 4901 scope.go:117] "RemoveContainer" containerID="020b36a732099fda10e743c4c9ce07cdc14faa1a9f13c54f58019c327a1cad11" Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.728556 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8v6q7" Mar 09 03:47:36 crc kubenswrapper[4901]: I0309 03:47:36.765873 4901 scope.go:117] "RemoveContainer" containerID="13ef48a4bb9f2c4f2f25d52df2c9cfc2719d3a067bb64074b75e239fa3162b21" Mar 09 03:47:37 crc kubenswrapper[4901]: I0309 03:47:37.121890 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-kube-api-access-lqgz5" (OuterVolumeSpecName: "kube-api-access-lqgz5") pod "0dacb9c0-bfd5-4e95-98fb-1508dc14977f" (UID: "0dacb9c0-bfd5-4e95-98fb-1508dc14977f"). InnerVolumeSpecName "kube-api-access-lqgz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:47:37 crc kubenswrapper[4901]: I0309 03:47:37.128507 4901 scope.go:117] "RemoveContainer" containerID="8ba2eef9d6305361ae1662bf65cc60e1283f8358122f2886a12ff95e4cc7fbe9" Mar 09 03:47:37 crc kubenswrapper[4901]: I0309 03:47:37.158900 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqgz5\" (UniqueName: \"kubernetes.io/projected/0dacb9c0-bfd5-4e95-98fb-1508dc14977f-kube-api-access-lqgz5\") on node \"crc\" DevicePath \"\"" Mar 09 03:47:37 crc kubenswrapper[4901]: I0309 03:47:37.383138 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8v6q7"] Mar 09 03:47:37 crc kubenswrapper[4901]: I0309 03:47:37.392137 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8v6q7"] Mar 09 03:47:38 crc kubenswrapper[4901]: I0309 03:47:38.114027 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:47:38 crc kubenswrapper[4901]: E0309 03:47:38.115033 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:47:38 crc kubenswrapper[4901]: I0309 03:47:38.143860 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" path="/var/lib/kubelet/pods/0dacb9c0-bfd5-4e95-98fb-1508dc14977f/volumes" Mar 09 03:47:41 crc kubenswrapper[4901]: I0309 03:47:41.589708 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:41 crc 
kubenswrapper[4901]: I0309 03:47:41.659424 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:42 crc kubenswrapper[4901]: I0309 03:47:42.575493 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4rhpl"] Mar 09 03:47:42 crc kubenswrapper[4901]: I0309 03:47:42.793847 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4rhpl" podUID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerName="registry-server" containerID="cri-o://7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d" gracePeriod=2 Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.348389 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.484628 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-catalog-content\") pod \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.484700 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbnlh\" (UniqueName: \"kubernetes.io/projected/0ae31526-1b58-4e36-bbd0-ecd908e453d6-kube-api-access-kbnlh\") pod \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.484740 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-utilities\") pod \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\" (UID: \"0ae31526-1b58-4e36-bbd0-ecd908e453d6\") " Mar 09 03:47:43 crc 
kubenswrapper[4901]: I0309 03:47:43.486385 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-utilities" (OuterVolumeSpecName: "utilities") pod "0ae31526-1b58-4e36-bbd0-ecd908e453d6" (UID: "0ae31526-1b58-4e36-bbd0-ecd908e453d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.491393 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae31526-1b58-4e36-bbd0-ecd908e453d6-kube-api-access-kbnlh" (OuterVolumeSpecName: "kube-api-access-kbnlh") pod "0ae31526-1b58-4e36-bbd0-ecd908e453d6" (UID: "0ae31526-1b58-4e36-bbd0-ecd908e453d6"). InnerVolumeSpecName "kube-api-access-kbnlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.586851 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbnlh\" (UniqueName: \"kubernetes.io/projected/0ae31526-1b58-4e36-bbd0-ecd908e453d6-kube-api-access-kbnlh\") on node \"crc\" DevicePath \"\"" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.586906 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.663009 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ae31526-1b58-4e36-bbd0-ecd908e453d6" (UID: "0ae31526-1b58-4e36-bbd0-ecd908e453d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.688137 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ae31526-1b58-4e36-bbd0-ecd908e453d6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.806239 4901 generic.go:334] "Generic (PLEG): container finished" podID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerID="7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d" exitCode=0 Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.806282 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rhpl" event={"ID":"0ae31526-1b58-4e36-bbd0-ecd908e453d6","Type":"ContainerDied","Data":"7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d"} Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.806334 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rhpl" event={"ID":"0ae31526-1b58-4e36-bbd0-ecd908e453d6","Type":"ContainerDied","Data":"21c1f8c103073950a9b37f354d0e0248b1741ba124e7d867b46ab7af0d5805cd"} Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.806353 4901 scope.go:117] "RemoveContainer" containerID="7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.806427 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4rhpl" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.830682 4901 scope.go:117] "RemoveContainer" containerID="06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.872811 4901 scope.go:117] "RemoveContainer" containerID="5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.913729 4901 scope.go:117] "RemoveContainer" containerID="7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d" Mar 09 03:47:43 crc kubenswrapper[4901]: E0309 03:47:43.914654 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d\": container with ID starting with 7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d not found: ID does not exist" containerID="7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.914717 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d"} err="failed to get container status \"7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d\": rpc error: code = NotFound desc = could not find container \"7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d\": container with ID starting with 7089cae823b3ce7d679c716b4eebece775d9f488f809316e93ab04bd9860623d not found: ID does not exist" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.914757 4901 scope.go:117] "RemoveContainer" containerID="06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b" Mar 09 03:47:43 crc kubenswrapper[4901]: E0309 03:47:43.915095 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b\": container with ID starting with 06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b not found: ID does not exist" containerID="06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.915121 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b"} err="failed to get container status \"06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b\": rpc error: code = NotFound desc = could not find container \"06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b\": container with ID starting with 06f774a7805a2535c5cd16a60a44508caf6c5d49514db6dba1fb8a08b8c1f32b not found: ID does not exist" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.915145 4901 scope.go:117] "RemoveContainer" containerID="5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.915275 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4rhpl"] Mar 09 03:47:43 crc kubenswrapper[4901]: E0309 03:47:43.916693 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e\": container with ID starting with 5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e not found: ID does not exist" containerID="5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.916738 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e"} err="failed to get container status 
\"5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e\": rpc error: code = NotFound desc = could not find container \"5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e\": container with ID starting with 5d4b01bc5f79ce934480bb35dffff2b60b81b9cca5aa4665e833c789625b536e not found: ID does not exist" Mar 09 03:47:43 crc kubenswrapper[4901]: I0309 03:47:43.927701 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4rhpl"] Mar 09 03:47:44 crc kubenswrapper[4901]: I0309 03:47:44.124989 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" path="/var/lib/kubelet/pods/0ae31526-1b58-4e36-bbd0-ecd908e453d6/volumes" Mar 09 03:47:50 crc kubenswrapper[4901]: I0309 03:47:50.106947 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:47:50 crc kubenswrapper[4901]: E0309 03:47:50.108085 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.154427 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550468-rbxnj"] Mar 09 03:48:00 crc kubenswrapper[4901]: E0309 03:48:00.155265 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerName="extract-utilities" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.155278 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerName="extract-utilities" Mar 09 
03:48:00 crc kubenswrapper[4901]: E0309 03:48:00.155292 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerName="extract-utilities" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.155298 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerName="extract-utilities" Mar 09 03:48:00 crc kubenswrapper[4901]: E0309 03:48:00.155312 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerName="registry-server" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.155319 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerName="registry-server" Mar 09 03:48:00 crc kubenswrapper[4901]: E0309 03:48:00.155333 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerName="extract-content" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.155340 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerName="extract-content" Mar 09 03:48:00 crc kubenswrapper[4901]: E0309 03:48:00.155355 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerName="extract-content" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.155361 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerName="extract-content" Mar 09 03:48:00 crc kubenswrapper[4901]: E0309 03:48:00.155371 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerName="registry-server" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.155380 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerName="registry-server" Mar 09 03:48:00 
crc kubenswrapper[4901]: I0309 03:48:00.155547 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dacb9c0-bfd5-4e95-98fb-1508dc14977f" containerName="registry-server" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.155566 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae31526-1b58-4e36-bbd0-ecd908e453d6" containerName="registry-server" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.156036 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550468-rbxnj" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.158959 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.159144 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.160465 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.178518 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550468-rbxnj"] Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.199269 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqlq\" (UniqueName: \"kubernetes.io/projected/c21da82b-5cbf-4d49-b100-39d7add2c473-kube-api-access-fwqlq\") pod \"auto-csr-approver-29550468-rbxnj\" (UID: \"c21da82b-5cbf-4d49-b100-39d7add2c473\") " pod="openshift-infra/auto-csr-approver-29550468-rbxnj" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.299718 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwqlq\" (UniqueName: \"kubernetes.io/projected/c21da82b-5cbf-4d49-b100-39d7add2c473-kube-api-access-fwqlq\") pod 
\"auto-csr-approver-29550468-rbxnj\" (UID: \"c21da82b-5cbf-4d49-b100-39d7add2c473\") " pod="openshift-infra/auto-csr-approver-29550468-rbxnj" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.319651 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqlq\" (UniqueName: \"kubernetes.io/projected/c21da82b-5cbf-4d49-b100-39d7add2c473-kube-api-access-fwqlq\") pod \"auto-csr-approver-29550468-rbxnj\" (UID: \"c21da82b-5cbf-4d49-b100-39d7add2c473\") " pod="openshift-infra/auto-csr-approver-29550468-rbxnj" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.486887 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550468-rbxnj" Mar 09 03:48:00 crc kubenswrapper[4901]: I0309 03:48:00.977661 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550468-rbxnj"] Mar 09 03:48:00 crc kubenswrapper[4901]: W0309 03:48:00.985010 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc21da82b_5cbf_4d49_b100_39d7add2c473.slice/crio-a6a7d65bfd55422ef0c16308d1e1ad500f3e5742f1e38b921bfcb2c75a4c829c WatchSource:0}: Error finding container a6a7d65bfd55422ef0c16308d1e1ad500f3e5742f1e38b921bfcb2c75a4c829c: Status 404 returned error can't find the container with id a6a7d65bfd55422ef0c16308d1e1ad500f3e5742f1e38b921bfcb2c75a4c829c Mar 09 03:48:01 crc kubenswrapper[4901]: I0309 03:48:01.974949 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550468-rbxnj" event={"ID":"c21da82b-5cbf-4d49-b100-39d7add2c473","Type":"ContainerStarted","Data":"a6a7d65bfd55422ef0c16308d1e1ad500f3e5742f1e38b921bfcb2c75a4c829c"} Mar 09 03:48:02 crc kubenswrapper[4901]: I0309 03:48:02.989710 4901 generic.go:334] "Generic (PLEG): container finished" podID="c21da82b-5cbf-4d49-b100-39d7add2c473" 
containerID="cadbdc92ca5753c3457d7b982aa88e6f67ccf9a2bc7d19b87bdd95eef2c6d15d" exitCode=0 Mar 09 03:48:02 crc kubenswrapper[4901]: I0309 03:48:02.989813 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550468-rbxnj" event={"ID":"c21da82b-5cbf-4d49-b100-39d7add2c473","Type":"ContainerDied","Data":"cadbdc92ca5753c3457d7b982aa88e6f67ccf9a2bc7d19b87bdd95eef2c6d15d"} Mar 09 03:48:03 crc kubenswrapper[4901]: I0309 03:48:03.107118 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:48:03 crc kubenswrapper[4901]: E0309 03:48:03.107956 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:48:04 crc kubenswrapper[4901]: I0309 03:48:04.396099 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550468-rbxnj" Mar 09 03:48:04 crc kubenswrapper[4901]: I0309 03:48:04.574016 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwqlq\" (UniqueName: \"kubernetes.io/projected/c21da82b-5cbf-4d49-b100-39d7add2c473-kube-api-access-fwqlq\") pod \"c21da82b-5cbf-4d49-b100-39d7add2c473\" (UID: \"c21da82b-5cbf-4d49-b100-39d7add2c473\") " Mar 09 03:48:04 crc kubenswrapper[4901]: I0309 03:48:04.582527 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21da82b-5cbf-4d49-b100-39d7add2c473-kube-api-access-fwqlq" (OuterVolumeSpecName: "kube-api-access-fwqlq") pod "c21da82b-5cbf-4d49-b100-39d7add2c473" (UID: "c21da82b-5cbf-4d49-b100-39d7add2c473"). InnerVolumeSpecName "kube-api-access-fwqlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:48:04 crc kubenswrapper[4901]: I0309 03:48:04.675841 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwqlq\" (UniqueName: \"kubernetes.io/projected/c21da82b-5cbf-4d49-b100-39d7add2c473-kube-api-access-fwqlq\") on node \"crc\" DevicePath \"\"" Mar 09 03:48:05 crc kubenswrapper[4901]: I0309 03:48:05.013660 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550468-rbxnj" event={"ID":"c21da82b-5cbf-4d49-b100-39d7add2c473","Type":"ContainerDied","Data":"a6a7d65bfd55422ef0c16308d1e1ad500f3e5742f1e38b921bfcb2c75a4c829c"} Mar 09 03:48:05 crc kubenswrapper[4901]: I0309 03:48:05.013710 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6a7d65bfd55422ef0c16308d1e1ad500f3e5742f1e38b921bfcb2c75a4c829c" Mar 09 03:48:05 crc kubenswrapper[4901]: I0309 03:48:05.013747 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550468-rbxnj" Mar 09 03:48:05 crc kubenswrapper[4901]: I0309 03:48:05.484688 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550462-mk9cz"] Mar 09 03:48:05 crc kubenswrapper[4901]: I0309 03:48:05.496546 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550462-mk9cz"] Mar 09 03:48:06 crc kubenswrapper[4901]: I0309 03:48:06.120771 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f6249c-909a-43ab-a1b4-b33e9473c5e8" path="/var/lib/kubelet/pods/35f6249c-909a-43ab-a1b4-b33e9473c5e8/volumes" Mar 09 03:48:18 crc kubenswrapper[4901]: I0309 03:48:18.106212 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:48:18 crc kubenswrapper[4901]: E0309 03:48:18.107201 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:48:32 crc kubenswrapper[4901]: I0309 03:48:32.106925 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:48:32 crc kubenswrapper[4901]: E0309 03:48:32.107906 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:48:40 crc kubenswrapper[4901]: I0309 03:48:40.398108 4901 scope.go:117] "RemoveContainer" containerID="4cd3269bf71ec5cc1ac624c14e83b983a53eef7e3afe963693716fbe54b0fd54" Mar 09 03:48:46 crc kubenswrapper[4901]: I0309 03:48:46.113387 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:48:46 crc kubenswrapper[4901]: E0309 03:48:46.114504 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:49:01 crc kubenswrapper[4901]: I0309 03:49:01.106496 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:49:01 crc kubenswrapper[4901]: E0309 03:49:01.107462 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:49:14 crc kubenswrapper[4901]: I0309 03:49:14.106612 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:49:14 crc kubenswrapper[4901]: E0309 03:49:14.107267 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:49:27 crc kubenswrapper[4901]: I0309 03:49:27.106148 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:49:27 crc kubenswrapper[4901]: E0309 03:49:27.107376 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:49:39 crc kubenswrapper[4901]: I0309 03:49:39.106375 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:49:39 crc kubenswrapper[4901]: E0309 03:49:39.107323 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:49:52 crc kubenswrapper[4901]: I0309 03:49:52.107203 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:49:52 crc kubenswrapper[4901]: E0309 03:49:52.108162 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.167173 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550470-5gdvz"] Mar 09 03:50:00 crc kubenswrapper[4901]: E0309 03:50:00.168141 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21da82b-5cbf-4d49-b100-39d7add2c473" containerName="oc" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.168156 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21da82b-5cbf-4d49-b100-39d7add2c473" containerName="oc" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.168359 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21da82b-5cbf-4d49-b100-39d7add2c473" containerName="oc" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.169012 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550470-5gdvz" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.171295 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.171298 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.176981 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.178112 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550470-5gdvz"] Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.317787 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lw9l\" (UniqueName: \"kubernetes.io/projected/1a81220b-a3e6-47b7-9d48-fd27b882200c-kube-api-access-2lw9l\") pod \"auto-csr-approver-29550470-5gdvz\" (UID: \"1a81220b-a3e6-47b7-9d48-fd27b882200c\") " pod="openshift-infra/auto-csr-approver-29550470-5gdvz" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.418872 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lw9l\" (UniqueName: \"kubernetes.io/projected/1a81220b-a3e6-47b7-9d48-fd27b882200c-kube-api-access-2lw9l\") pod \"auto-csr-approver-29550470-5gdvz\" (UID: \"1a81220b-a3e6-47b7-9d48-fd27b882200c\") " pod="openshift-infra/auto-csr-approver-29550470-5gdvz" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.457451 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lw9l\" (UniqueName: \"kubernetes.io/projected/1a81220b-a3e6-47b7-9d48-fd27b882200c-kube-api-access-2lw9l\") pod \"auto-csr-approver-29550470-5gdvz\" (UID: \"1a81220b-a3e6-47b7-9d48-fd27b882200c\") " 
pod="openshift-infra/auto-csr-approver-29550470-5gdvz" Mar 09 03:50:00 crc kubenswrapper[4901]: I0309 03:50:00.524275 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550470-5gdvz" Mar 09 03:50:01 crc kubenswrapper[4901]: I0309 03:50:01.036355 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550470-5gdvz"] Mar 09 03:50:01 crc kubenswrapper[4901]: I0309 03:50:01.115434 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550470-5gdvz" event={"ID":"1a81220b-a3e6-47b7-9d48-fd27b882200c","Type":"ContainerStarted","Data":"668e47cfc59cd929dd806d3dd01215c5ff859bd57d750ac2d745940d8a05a917"} Mar 09 03:50:03 crc kubenswrapper[4901]: I0309 03:50:03.129310 4901 generic.go:334] "Generic (PLEG): container finished" podID="1a81220b-a3e6-47b7-9d48-fd27b882200c" containerID="f77e67cead9ee618419896089f63f1d531dd326f6f93efacb78cd2deba027485" exitCode=0 Mar 09 03:50:03 crc kubenswrapper[4901]: I0309 03:50:03.129393 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550470-5gdvz" event={"ID":"1a81220b-a3e6-47b7-9d48-fd27b882200c","Type":"ContainerDied","Data":"f77e67cead9ee618419896089f63f1d531dd326f6f93efacb78cd2deba027485"} Mar 09 03:50:04 crc kubenswrapper[4901]: I0309 03:50:04.529364 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550470-5gdvz" Mar 09 03:50:04 crc kubenswrapper[4901]: I0309 03:50:04.589689 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lw9l\" (UniqueName: \"kubernetes.io/projected/1a81220b-a3e6-47b7-9d48-fd27b882200c-kube-api-access-2lw9l\") pod \"1a81220b-a3e6-47b7-9d48-fd27b882200c\" (UID: \"1a81220b-a3e6-47b7-9d48-fd27b882200c\") " Mar 09 03:50:04 crc kubenswrapper[4901]: I0309 03:50:04.598272 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a81220b-a3e6-47b7-9d48-fd27b882200c-kube-api-access-2lw9l" (OuterVolumeSpecName: "kube-api-access-2lw9l") pod "1a81220b-a3e6-47b7-9d48-fd27b882200c" (UID: "1a81220b-a3e6-47b7-9d48-fd27b882200c"). InnerVolumeSpecName "kube-api-access-2lw9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:50:04 crc kubenswrapper[4901]: I0309 03:50:04.693053 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lw9l\" (UniqueName: \"kubernetes.io/projected/1a81220b-a3e6-47b7-9d48-fd27b882200c-kube-api-access-2lw9l\") on node \"crc\" DevicePath \"\"" Mar 09 03:50:05 crc kubenswrapper[4901]: I0309 03:50:05.150707 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550470-5gdvz" event={"ID":"1a81220b-a3e6-47b7-9d48-fd27b882200c","Type":"ContainerDied","Data":"668e47cfc59cd929dd806d3dd01215c5ff859bd57d750ac2d745940d8a05a917"} Mar 09 03:50:05 crc kubenswrapper[4901]: I0309 03:50:05.150748 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550470-5gdvz" Mar 09 03:50:05 crc kubenswrapper[4901]: I0309 03:50:05.150763 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="668e47cfc59cd929dd806d3dd01215c5ff859bd57d750ac2d745940d8a05a917" Mar 09 03:50:05 crc kubenswrapper[4901]: I0309 03:50:05.625733 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550464-tktjd"] Mar 09 03:50:05 crc kubenswrapper[4901]: I0309 03:50:05.634853 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550464-tktjd"] Mar 09 03:50:06 crc kubenswrapper[4901]: I0309 03:50:06.113333 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:50:06 crc kubenswrapper[4901]: E0309 03:50:06.113733 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:50:06 crc kubenswrapper[4901]: I0309 03:50:06.126943 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af508450-04dd-4c49-b595-c84aa1f509ac" path="/var/lib/kubelet/pods/af508450-04dd-4c49-b595-c84aa1f509ac/volumes" Mar 09 03:50:21 crc kubenswrapper[4901]: I0309 03:50:21.107280 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:50:21 crc kubenswrapper[4901]: E0309 03:50:21.108526 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:50:32 crc kubenswrapper[4901]: I0309 03:50:32.107284 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:50:32 crc kubenswrapper[4901]: E0309 03:50:32.108524 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:50:40 crc kubenswrapper[4901]: I0309 03:50:40.548993 4901 scope.go:117] "RemoveContainer" containerID="4b6a5495d7355a42aa7f37ac022b5887c43c84c12dfaf38b9f6818e2159ea48c" Mar 09 03:50:46 crc kubenswrapper[4901]: I0309 03:50:46.114426 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:50:46 crc kubenswrapper[4901]: E0309 03:50:46.114844 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:50:58 crc kubenswrapper[4901]: I0309 03:50:58.106410 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:50:58 crc kubenswrapper[4901]: 
E0309 03:50:58.107737 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:51:12 crc kubenswrapper[4901]: I0309 03:51:12.106500 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:51:12 crc kubenswrapper[4901]: E0309 03:51:12.107527 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:51:24 crc kubenswrapper[4901]: I0309 03:51:24.107505 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:51:24 crc kubenswrapper[4901]: E0309 03:51:24.108476 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:51:39 crc kubenswrapper[4901]: I0309 03:51:39.106433 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:51:39 crc 
kubenswrapper[4901]: E0309 03:51:39.108469 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:51:54 crc kubenswrapper[4901]: I0309 03:51:54.106697 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:51:54 crc kubenswrapper[4901]: E0309 03:51:54.109059 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.175544 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550472-6xqn9"] Mar 09 03:52:00 crc kubenswrapper[4901]: E0309 03:52:00.177769 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a81220b-a3e6-47b7-9d48-fd27b882200c" containerName="oc" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.177795 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a81220b-a3e6-47b7-9d48-fd27b882200c" containerName="oc" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.178140 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a81220b-a3e6-47b7-9d48-fd27b882200c" containerName="oc" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.179209 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550472-6xqn9" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.183069 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.183424 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.183573 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.186392 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550472-6xqn9"] Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.345634 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shxtf\" (UniqueName: \"kubernetes.io/projected/b0c5f03c-0270-4605-a375-76bd7308381e-kube-api-access-shxtf\") pod \"auto-csr-approver-29550472-6xqn9\" (UID: \"b0c5f03c-0270-4605-a375-76bd7308381e\") " pod="openshift-infra/auto-csr-approver-29550472-6xqn9" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.447954 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shxtf\" (UniqueName: \"kubernetes.io/projected/b0c5f03c-0270-4605-a375-76bd7308381e-kube-api-access-shxtf\") pod \"auto-csr-approver-29550472-6xqn9\" (UID: \"b0c5f03c-0270-4605-a375-76bd7308381e\") " pod="openshift-infra/auto-csr-approver-29550472-6xqn9" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.713273 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shxtf\" (UniqueName: \"kubernetes.io/projected/b0c5f03c-0270-4605-a375-76bd7308381e-kube-api-access-shxtf\") pod \"auto-csr-approver-29550472-6xqn9\" (UID: \"b0c5f03c-0270-4605-a375-76bd7308381e\") " 
pod="openshift-infra/auto-csr-approver-29550472-6xqn9" Mar 09 03:52:00 crc kubenswrapper[4901]: I0309 03:52:00.810193 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550472-6xqn9" Mar 09 03:52:01 crc kubenswrapper[4901]: I0309 03:52:01.080135 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550472-6xqn9"] Mar 09 03:52:01 crc kubenswrapper[4901]: I0309 03:52:01.262229 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550472-6xqn9" event={"ID":"b0c5f03c-0270-4605-a375-76bd7308381e","Type":"ContainerStarted","Data":"f88c3a5f7e60bb08b8ab3d5b1444b4798f0d41dfa35c482a85e6d3d191970ae2"} Mar 09 03:52:03 crc kubenswrapper[4901]: I0309 03:52:03.282636 4901 generic.go:334] "Generic (PLEG): container finished" podID="b0c5f03c-0270-4605-a375-76bd7308381e" containerID="1b1343d21249b5c8b933d1331da9795623fd2c10a4cb2c6ffd9c1bb8dc890c32" exitCode=0 Mar 09 03:52:03 crc kubenswrapper[4901]: I0309 03:52:03.282713 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550472-6xqn9" event={"ID":"b0c5f03c-0270-4605-a375-76bd7308381e","Type":"ContainerDied","Data":"1b1343d21249b5c8b933d1331da9795623fd2c10a4cb2c6ffd9c1bb8dc890c32"} Mar 09 03:52:04 crc kubenswrapper[4901]: I0309 03:52:04.719672 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550472-6xqn9" Mar 09 03:52:04 crc kubenswrapper[4901]: I0309 03:52:04.919844 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shxtf\" (UniqueName: \"kubernetes.io/projected/b0c5f03c-0270-4605-a375-76bd7308381e-kube-api-access-shxtf\") pod \"b0c5f03c-0270-4605-a375-76bd7308381e\" (UID: \"b0c5f03c-0270-4605-a375-76bd7308381e\") " Mar 09 03:52:04 crc kubenswrapper[4901]: I0309 03:52:04.928589 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c5f03c-0270-4605-a375-76bd7308381e-kube-api-access-shxtf" (OuterVolumeSpecName: "kube-api-access-shxtf") pod "b0c5f03c-0270-4605-a375-76bd7308381e" (UID: "b0c5f03c-0270-4605-a375-76bd7308381e"). InnerVolumeSpecName "kube-api-access-shxtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:52:05 crc kubenswrapper[4901]: I0309 03:52:05.022388 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shxtf\" (UniqueName: \"kubernetes.io/projected/b0c5f03c-0270-4605-a375-76bd7308381e-kube-api-access-shxtf\") on node \"crc\" DevicePath \"\"" Mar 09 03:52:05 crc kubenswrapper[4901]: I0309 03:52:05.304085 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550472-6xqn9" event={"ID":"b0c5f03c-0270-4605-a375-76bd7308381e","Type":"ContainerDied","Data":"f88c3a5f7e60bb08b8ab3d5b1444b4798f0d41dfa35c482a85e6d3d191970ae2"} Mar 09 03:52:05 crc kubenswrapper[4901]: I0309 03:52:05.304140 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f88c3a5f7e60bb08b8ab3d5b1444b4798f0d41dfa35c482a85e6d3d191970ae2" Mar 09 03:52:05 crc kubenswrapper[4901]: I0309 03:52:05.304181 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550472-6xqn9" Mar 09 03:52:05 crc kubenswrapper[4901]: I0309 03:52:05.823020 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550466-xvdcj"] Mar 09 03:52:05 crc kubenswrapper[4901]: I0309 03:52:05.833279 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550466-xvdcj"] Mar 09 03:52:06 crc kubenswrapper[4901]: I0309 03:52:06.123429 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c" path="/var/lib/kubelet/pods/8ba1faa7-d1dc-4ab5-b6ae-bfc75be7843c/volumes" Mar 09 03:52:08 crc kubenswrapper[4901]: I0309 03:52:08.106744 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:52:09 crc kubenswrapper[4901]: I0309 03:52:09.345520 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"2bbd5a3d2c0fe44692c1657737ce4c08f09c92b4a8c3331b9fa71858ac3e491a"} Mar 09 03:52:40 crc kubenswrapper[4901]: I0309 03:52:40.680986 4901 scope.go:117] "RemoveContainer" containerID="ff4d20b44289f2c1b459d690d0c146978428c682c423c25ae827bc1424c9336f" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.166999 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550474-hjmmp"] Mar 09 03:54:00 crc kubenswrapper[4901]: E0309 03:54:00.168106 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c5f03c-0270-4605-a375-76bd7308381e" containerName="oc" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.168129 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c5f03c-0270-4605-a375-76bd7308381e" containerName="oc" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.168434 4901 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c5f03c-0270-4605-a375-76bd7308381e" containerName="oc" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.169282 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550474-hjmmp" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.171903 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.172574 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.173268 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.182038 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550474-hjmmp"] Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.322905 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4v24\" (UniqueName: \"kubernetes.io/projected/dd2d162d-bdfc-43b9-a4ec-a72f813974a2-kube-api-access-r4v24\") pod \"auto-csr-approver-29550474-hjmmp\" (UID: \"dd2d162d-bdfc-43b9-a4ec-a72f813974a2\") " pod="openshift-infra/auto-csr-approver-29550474-hjmmp" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.424001 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4v24\" (UniqueName: \"kubernetes.io/projected/dd2d162d-bdfc-43b9-a4ec-a72f813974a2-kube-api-access-r4v24\") pod \"auto-csr-approver-29550474-hjmmp\" (UID: \"dd2d162d-bdfc-43b9-a4ec-a72f813974a2\") " pod="openshift-infra/auto-csr-approver-29550474-hjmmp" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.455300 4901 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-r4v24\" (UniqueName: \"kubernetes.io/projected/dd2d162d-bdfc-43b9-a4ec-a72f813974a2-kube-api-access-r4v24\") pod \"auto-csr-approver-29550474-hjmmp\" (UID: \"dd2d162d-bdfc-43b9-a4ec-a72f813974a2\") " pod="openshift-infra/auto-csr-approver-29550474-hjmmp" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.501413 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550474-hjmmp" Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.972024 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550474-hjmmp"] Mar 09 03:54:00 crc kubenswrapper[4901]: I0309 03:54:00.975624 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 03:54:01 crc kubenswrapper[4901]: I0309 03:54:01.432980 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550474-hjmmp" event={"ID":"dd2d162d-bdfc-43b9-a4ec-a72f813974a2","Type":"ContainerStarted","Data":"2fc90933c6f508d1f1d363bf6611437bb44216d69bf692b436e623419e498904"} Mar 09 03:54:02 crc kubenswrapper[4901]: I0309 03:54:02.441108 4901 generic.go:334] "Generic (PLEG): container finished" podID="dd2d162d-bdfc-43b9-a4ec-a72f813974a2" containerID="656f267c36a077587532187ba28b0e350644f5e384ff33f8765d78fdb18236d8" exitCode=0 Mar 09 03:54:02 crc kubenswrapper[4901]: I0309 03:54:02.441206 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550474-hjmmp" event={"ID":"dd2d162d-bdfc-43b9-a4ec-a72f813974a2","Type":"ContainerDied","Data":"656f267c36a077587532187ba28b0e350644f5e384ff33f8765d78fdb18236d8"} Mar 09 03:54:03 crc kubenswrapper[4901]: I0309 03:54:03.770049 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550474-hjmmp" Mar 09 03:54:03 crc kubenswrapper[4901]: I0309 03:54:03.877882 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4v24\" (UniqueName: \"kubernetes.io/projected/dd2d162d-bdfc-43b9-a4ec-a72f813974a2-kube-api-access-r4v24\") pod \"dd2d162d-bdfc-43b9-a4ec-a72f813974a2\" (UID: \"dd2d162d-bdfc-43b9-a4ec-a72f813974a2\") " Mar 09 03:54:04 crc kubenswrapper[4901]: I0309 03:54:04.110852 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2d162d-bdfc-43b9-a4ec-a72f813974a2-kube-api-access-r4v24" (OuterVolumeSpecName: "kube-api-access-r4v24") pod "dd2d162d-bdfc-43b9-a4ec-a72f813974a2" (UID: "dd2d162d-bdfc-43b9-a4ec-a72f813974a2"). InnerVolumeSpecName "kube-api-access-r4v24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:54:04 crc kubenswrapper[4901]: I0309 03:54:04.182993 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4v24\" (UniqueName: \"kubernetes.io/projected/dd2d162d-bdfc-43b9-a4ec-a72f813974a2-kube-api-access-r4v24\") on node \"crc\" DevicePath \"\"" Mar 09 03:54:04 crc kubenswrapper[4901]: I0309 03:54:04.463871 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550474-hjmmp" event={"ID":"dd2d162d-bdfc-43b9-a4ec-a72f813974a2","Type":"ContainerDied","Data":"2fc90933c6f508d1f1d363bf6611437bb44216d69bf692b436e623419e498904"} Mar 09 03:54:04 crc kubenswrapper[4901]: I0309 03:54:04.463914 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fc90933c6f508d1f1d363bf6611437bb44216d69bf692b436e623419e498904" Mar 09 03:54:04 crc kubenswrapper[4901]: I0309 03:54:04.464092 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550474-hjmmp" Mar 09 03:54:04 crc kubenswrapper[4901]: I0309 03:54:04.867488 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550468-rbxnj"] Mar 09 03:54:04 crc kubenswrapper[4901]: I0309 03:54:04.876842 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550468-rbxnj"] Mar 09 03:54:06 crc kubenswrapper[4901]: I0309 03:54:06.113215 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21da82b-5cbf-4d49-b100-39d7add2c473" path="/var/lib/kubelet/pods/c21da82b-5cbf-4d49-b100-39d7add2c473/volumes" Mar 09 03:54:30 crc kubenswrapper[4901]: I0309 03:54:30.863417 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:54:30 crc kubenswrapper[4901]: I0309 03:54:30.864302 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:54:40 crc kubenswrapper[4901]: I0309 03:54:40.798065 4901 scope.go:117] "RemoveContainer" containerID="cadbdc92ca5753c3457d7b982aa88e6f67ccf9a2bc7d19b87bdd95eef2c6d15d" Mar 09 03:55:00 crc kubenswrapper[4901]: I0309 03:55:00.863366 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:55:00 crc kubenswrapper[4901]: 
I0309 03:55:00.865969 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:55:30 crc kubenswrapper[4901]: I0309 03:55:30.863011 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 03:55:30 crc kubenswrapper[4901]: I0309 03:55:30.863724 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 03:55:30 crc kubenswrapper[4901]: I0309 03:55:30.863803 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 03:55:30 crc kubenswrapper[4901]: I0309 03:55:30.864805 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bbd5a3d2c0fe44692c1657737ce4c08f09c92b4a8c3331b9fa71858ac3e491a"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 03:55:30 crc kubenswrapper[4901]: I0309 03:55:30.864921 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" 
containerName="machine-config-daemon" containerID="cri-o://2bbd5a3d2c0fe44692c1657737ce4c08f09c92b4a8c3331b9fa71858ac3e491a" gracePeriod=600 Mar 09 03:55:31 crc kubenswrapper[4901]: I0309 03:55:31.268518 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="2bbd5a3d2c0fe44692c1657737ce4c08f09c92b4a8c3331b9fa71858ac3e491a" exitCode=0 Mar 09 03:55:31 crc kubenswrapper[4901]: I0309 03:55:31.268577 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"2bbd5a3d2c0fe44692c1657737ce4c08f09c92b4a8c3331b9fa71858ac3e491a"} Mar 09 03:55:31 crc kubenswrapper[4901]: I0309 03:55:31.268614 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39"} Mar 09 03:55:31 crc kubenswrapper[4901]: I0309 03:55:31.268656 4901 scope.go:117] "RemoveContainer" containerID="e6a90193a69cced7a1ad9274aa0b99053439ef573ce6ab6a66d6ddefe76fb026" Mar 09 03:55:36 crc kubenswrapper[4901]: I0309 03:55:36.749044 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2j2ht"] Mar 09 03:55:36 crc kubenswrapper[4901]: E0309 03:55:36.750463 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2d162d-bdfc-43b9-a4ec-a72f813974a2" containerName="oc" Mar 09 03:55:36 crc kubenswrapper[4901]: I0309 03:55:36.750500 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2d162d-bdfc-43b9-a4ec-a72f813974a2" containerName="oc" Mar 09 03:55:36 crc kubenswrapper[4901]: I0309 03:55:36.750863 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2d162d-bdfc-43b9-a4ec-a72f813974a2" containerName="oc" Mar 09 03:55:36 crc 
kubenswrapper[4901]: I0309 03:55:36.753276 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:36 crc kubenswrapper[4901]: I0309 03:55:36.763819 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2j2ht"] Mar 09 03:55:36 crc kubenswrapper[4901]: I0309 03:55:36.902033 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-utilities\") pod \"community-operators-2j2ht\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:36 crc kubenswrapper[4901]: I0309 03:55:36.902141 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fcqp\" (UniqueName: \"kubernetes.io/projected/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-kube-api-access-6fcqp\") pod \"community-operators-2j2ht\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:36 crc kubenswrapper[4901]: I0309 03:55:36.902203 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-catalog-content\") pod \"community-operators-2j2ht\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:37 crc kubenswrapper[4901]: I0309 03:55:37.003534 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-utilities\") pod \"community-operators-2j2ht\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:37 crc 
kubenswrapper[4901]: I0309 03:55:37.003949 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fcqp\" (UniqueName: \"kubernetes.io/projected/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-kube-api-access-6fcqp\") pod \"community-operators-2j2ht\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:37 crc kubenswrapper[4901]: I0309 03:55:37.004036 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-utilities\") pod \"community-operators-2j2ht\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:37 crc kubenswrapper[4901]: I0309 03:55:37.004049 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-catalog-content\") pod \"community-operators-2j2ht\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:37 crc kubenswrapper[4901]: I0309 03:55:37.004484 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-catalog-content\") pod \"community-operators-2j2ht\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:37 crc kubenswrapper[4901]: I0309 03:55:37.026708 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fcqp\" (UniqueName: \"kubernetes.io/projected/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-kube-api-access-6fcqp\") pod \"community-operators-2j2ht\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:37 crc kubenswrapper[4901]: I0309 
03:55:37.095850 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:37 crc kubenswrapper[4901]: I0309 03:55:37.625337 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2j2ht"] Mar 09 03:55:38 crc kubenswrapper[4901]: I0309 03:55:38.331285 4901 generic.go:334] "Generic (PLEG): container finished" podID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" containerID="1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541" exitCode=0 Mar 09 03:55:38 crc kubenswrapper[4901]: I0309 03:55:38.331375 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j2ht" event={"ID":"e0ff6388-6d41-4301-8625-4ae2b42e6dc2","Type":"ContainerDied","Data":"1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541"} Mar 09 03:55:38 crc kubenswrapper[4901]: I0309 03:55:38.331834 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j2ht" event={"ID":"e0ff6388-6d41-4301-8625-4ae2b42e6dc2","Type":"ContainerStarted","Data":"3d52cc8d3714a8137ce7723a40ef835970c278694d16b675d009c9d91ae80b84"} Mar 09 03:55:40 crc kubenswrapper[4901]: I0309 03:55:40.382401 4901 generic.go:334] "Generic (PLEG): container finished" podID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" containerID="27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9" exitCode=0 Mar 09 03:55:40 crc kubenswrapper[4901]: I0309 03:55:40.382580 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j2ht" event={"ID":"e0ff6388-6d41-4301-8625-4ae2b42e6dc2","Type":"ContainerDied","Data":"27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9"} Mar 09 03:55:41 crc kubenswrapper[4901]: I0309 03:55:41.398634 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j2ht" 
event={"ID":"e0ff6388-6d41-4301-8625-4ae2b42e6dc2","Type":"ContainerStarted","Data":"4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0"} Mar 09 03:55:42 crc kubenswrapper[4901]: I0309 03:55:42.431279 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2j2ht" podStartSLOduration=3.725515874 podStartE2EDuration="6.431254789s" podCreationTimestamp="2026-03-09 03:55:36 +0000 UTC" firstStartedPulling="2026-03-09 03:55:38.335052001 +0000 UTC m=+4462.924715773" lastFinishedPulling="2026-03-09 03:55:41.040790916 +0000 UTC m=+4465.630454688" observedRunningTime="2026-03-09 03:55:42.428571724 +0000 UTC m=+4467.018235466" watchObservedRunningTime="2026-03-09 03:55:42.431254789 +0000 UTC m=+4467.020918531" Mar 09 03:55:47 crc kubenswrapper[4901]: I0309 03:55:47.097039 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:47 crc kubenswrapper[4901]: I0309 03:55:47.097834 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:47 crc kubenswrapper[4901]: I0309 03:55:47.193938 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:47 crc kubenswrapper[4901]: I0309 03:55:47.522170 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:47 crc kubenswrapper[4901]: I0309 03:55:47.593435 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2j2ht"] Mar 09 03:55:49 crc kubenswrapper[4901]: I0309 03:55:49.476047 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2j2ht" podUID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" containerName="registry-server" 
containerID="cri-o://4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0" gracePeriod=2 Mar 09 03:55:49 crc kubenswrapper[4901]: I0309 03:55:49.958176 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.132802 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-utilities\") pod \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.132873 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fcqp\" (UniqueName: \"kubernetes.io/projected/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-kube-api-access-6fcqp\") pod \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.133003 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-catalog-content\") pod \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\" (UID: \"e0ff6388-6d41-4301-8625-4ae2b42e6dc2\") " Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.134627 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-utilities" (OuterVolumeSpecName: "utilities") pod "e0ff6388-6d41-4301-8625-4ae2b42e6dc2" (UID: "e0ff6388-6d41-4301-8625-4ae2b42e6dc2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.139536 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-kube-api-access-6fcqp" (OuterVolumeSpecName: "kube-api-access-6fcqp") pod "e0ff6388-6d41-4301-8625-4ae2b42e6dc2" (UID: "e0ff6388-6d41-4301-8625-4ae2b42e6dc2"). InnerVolumeSpecName "kube-api-access-6fcqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.210112 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0ff6388-6d41-4301-8625-4ae2b42e6dc2" (UID: "e0ff6388-6d41-4301-8625-4ae2b42e6dc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.235012 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.235044 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.235057 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fcqp\" (UniqueName: \"kubernetes.io/projected/e0ff6388-6d41-4301-8625-4ae2b42e6dc2-kube-api-access-6fcqp\") on node \"crc\" DevicePath \"\"" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.488345 4901 generic.go:334] "Generic (PLEG): container finished" podID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" 
containerID="4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0" exitCode=0 Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.488638 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j2ht" event={"ID":"e0ff6388-6d41-4301-8625-4ae2b42e6dc2","Type":"ContainerDied","Data":"4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0"} Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.488836 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j2ht" event={"ID":"e0ff6388-6d41-4301-8625-4ae2b42e6dc2","Type":"ContainerDied","Data":"3d52cc8d3714a8137ce7723a40ef835970c278694d16b675d009c9d91ae80b84"} Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.488891 4901 scope.go:117] "RemoveContainer" containerID="4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.488748 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2j2ht" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.527079 4901 scope.go:117] "RemoveContainer" containerID="27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.563806 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2j2ht"] Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.577254 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2j2ht"] Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.584908 4901 scope.go:117] "RemoveContainer" containerID="1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.612476 4901 scope.go:117] "RemoveContainer" containerID="4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0" Mar 09 03:55:50 crc kubenswrapper[4901]: E0309 03:55:50.613440 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0\": container with ID starting with 4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0 not found: ID does not exist" containerID="4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.613493 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0"} err="failed to get container status \"4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0\": rpc error: code = NotFound desc = could not find container \"4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0\": container with ID starting with 4bd05d9c8772ed66651688c1872431882383feeeb0b88987b239e796f70d0dd0 not 
found: ID does not exist" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.613517 4901 scope.go:117] "RemoveContainer" containerID="27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9" Mar 09 03:55:50 crc kubenswrapper[4901]: E0309 03:55:50.614078 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9\": container with ID starting with 27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9 not found: ID does not exist" containerID="27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.614107 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9"} err="failed to get container status \"27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9\": rpc error: code = NotFound desc = could not find container \"27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9\": container with ID starting with 27d6c05b358666e7fa093b61cab8e30b92dd8ae1b7179fe3903a8ebaae593bd9 not found: ID does not exist" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.614123 4901 scope.go:117] "RemoveContainer" containerID="1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541" Mar 09 03:55:50 crc kubenswrapper[4901]: E0309 03:55:50.614677 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541\": container with ID starting with 1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541 not found: ID does not exist" containerID="1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541" Mar 09 03:55:50 crc kubenswrapper[4901]: I0309 03:55:50.614702 4901 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541"} err="failed to get container status \"1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541\": rpc error: code = NotFound desc = could not find container \"1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541\": container with ID starting with 1bd97127c5f1d1e50a48355a0a38c7111af2c4209d6027580a957eb10645d541 not found: ID does not exist" Mar 09 03:55:52 crc kubenswrapper[4901]: I0309 03:55:52.124746 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" path="/var/lib/kubelet/pods/e0ff6388-6d41-4301-8625-4ae2b42e6dc2/volumes" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.466377 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s624m"] Mar 09 03:55:58 crc kubenswrapper[4901]: E0309 03:55:58.467895 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" containerName="extract-content" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.467911 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" containerName="extract-content" Mar 09 03:55:58 crc kubenswrapper[4901]: E0309 03:55:58.467933 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" containerName="extract-utilities" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.467943 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" containerName="extract-utilities" Mar 09 03:55:58 crc kubenswrapper[4901]: E0309 03:55:58.467957 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" containerName="registry-server" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 
03:55:58.467965 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" containerName="registry-server" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.468205 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ff6388-6d41-4301-8625-4ae2b42e6dc2" containerName="registry-server" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.469455 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.485495 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s624m"] Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.574093 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-catalog-content\") pod \"redhat-marketplace-s624m\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.574978 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f62rh\" (UniqueName: \"kubernetes.io/projected/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-kube-api-access-f62rh\") pod \"redhat-marketplace-s624m\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.575175 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-utilities\") pod \"redhat-marketplace-s624m\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:58 crc kubenswrapper[4901]: 
I0309 03:55:58.676854 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-catalog-content\") pod \"redhat-marketplace-s624m\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.676968 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f62rh\" (UniqueName: \"kubernetes.io/projected/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-kube-api-access-f62rh\") pod \"redhat-marketplace-s624m\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.677096 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-utilities\") pod \"redhat-marketplace-s624m\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.678444 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-utilities\") pod \"redhat-marketplace-s624m\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.678445 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-catalog-content\") pod \"redhat-marketplace-s624m\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.712287 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f62rh\" (UniqueName: \"kubernetes.io/projected/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-kube-api-access-f62rh\") pod \"redhat-marketplace-s624m\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:58 crc kubenswrapper[4901]: I0309 03:55:58.794839 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:55:59 crc kubenswrapper[4901]: I0309 03:55:59.070489 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s624m"] Mar 09 03:55:59 crc kubenswrapper[4901]: I0309 03:55:59.579445 4901 generic.go:334] "Generic (PLEG): container finished" podID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerID="a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5" exitCode=0 Mar 09 03:55:59 crc kubenswrapper[4901]: I0309 03:55:59.579538 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s624m" event={"ID":"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc","Type":"ContainerDied","Data":"a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5"} Mar 09 03:55:59 crc kubenswrapper[4901]: I0309 03:55:59.579855 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s624m" event={"ID":"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc","Type":"ContainerStarted","Data":"b95ae616543836cbd2b832757f540ff7fff2604e9491ef7e39bb0458cc8cad3b"} Mar 09 03:56:00 crc kubenswrapper[4901]: I0309 03:56:00.167088 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550476-s6wwl"] Mar 09 03:56:00 crc kubenswrapper[4901]: I0309 03:56:00.169115 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550476-s6wwl" Mar 09 03:56:00 crc kubenswrapper[4901]: I0309 03:56:00.179519 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550476-s6wwl"] Mar 09 03:56:00 crc kubenswrapper[4901]: I0309 03:56:00.206361 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 03:56:00 crc kubenswrapper[4901]: I0309 03:56:00.206509 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 03:56:00 crc kubenswrapper[4901]: I0309 03:56:00.209744 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 03:56:00 crc kubenswrapper[4901]: I0309 03:56:00.305107 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j95fz\" (UniqueName: \"kubernetes.io/projected/20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13-kube-api-access-j95fz\") pod \"auto-csr-approver-29550476-s6wwl\" (UID: \"20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13\") " pod="openshift-infra/auto-csr-approver-29550476-s6wwl" Mar 09 03:56:00 crc kubenswrapper[4901]: I0309 03:56:00.406552 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j95fz\" (UniqueName: \"kubernetes.io/projected/20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13-kube-api-access-j95fz\") pod \"auto-csr-approver-29550476-s6wwl\" (UID: \"20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13\") " pod="openshift-infra/auto-csr-approver-29550476-s6wwl" Mar 09 03:56:00 crc kubenswrapper[4901]: I0309 03:56:00.430851 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j95fz\" (UniqueName: \"kubernetes.io/projected/20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13-kube-api-access-j95fz\") pod \"auto-csr-approver-29550476-s6wwl\" (UID: \"20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13\") " 
pod="openshift-infra/auto-csr-approver-29550476-s6wwl" Mar 09 03:56:00 crc kubenswrapper[4901]: I0309 03:56:00.517880 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550476-s6wwl" Mar 09 03:56:01 crc kubenswrapper[4901]: I0309 03:56:01.057694 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550476-s6wwl"] Mar 09 03:56:01 crc kubenswrapper[4901]: W0309 03:56:01.067199 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20ce0ce0_5535_4b6b_ba9f_44ba2b3bbc13.slice/crio-65f657e639d56af306fae828fb477491ea371ee34626539770c8d47a1b314e7c WatchSource:0}: Error finding container 65f657e639d56af306fae828fb477491ea371ee34626539770c8d47a1b314e7c: Status 404 returned error can't find the container with id 65f657e639d56af306fae828fb477491ea371ee34626539770c8d47a1b314e7c Mar 09 03:56:01 crc kubenswrapper[4901]: I0309 03:56:01.597406 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550476-s6wwl" event={"ID":"20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13","Type":"ContainerStarted","Data":"65f657e639d56af306fae828fb477491ea371ee34626539770c8d47a1b314e7c"} Mar 09 03:56:01 crc kubenswrapper[4901]: I0309 03:56:01.601120 4901 generic.go:334] "Generic (PLEG): container finished" podID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerID="02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8" exitCode=0 Mar 09 03:56:01 crc kubenswrapper[4901]: I0309 03:56:01.601164 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s624m" event={"ID":"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc","Type":"ContainerDied","Data":"02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8"} Mar 09 03:56:02 crc kubenswrapper[4901]: I0309 03:56:02.610802 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13" containerID="1ad4b5968552d7fcdb51346e35443083df9b75fe6ebc01bc458c2adcf76e9e05" exitCode=0 Mar 09 03:56:02 crc kubenswrapper[4901]: I0309 03:56:02.610907 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550476-s6wwl" event={"ID":"20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13","Type":"ContainerDied","Data":"1ad4b5968552d7fcdb51346e35443083df9b75fe6ebc01bc458c2adcf76e9e05"} Mar 09 03:56:02 crc kubenswrapper[4901]: I0309 03:56:02.613514 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s624m" event={"ID":"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc","Type":"ContainerStarted","Data":"f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9"} Mar 09 03:56:02 crc kubenswrapper[4901]: I0309 03:56:02.655565 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s624m" podStartSLOduration=2.244267524 podStartE2EDuration="4.655544617s" podCreationTimestamp="2026-03-09 03:55:58 +0000 UTC" firstStartedPulling="2026-03-09 03:55:59.583487351 +0000 UTC m=+4484.173151123" lastFinishedPulling="2026-03-09 03:56:01.994764474 +0000 UTC m=+4486.584428216" observedRunningTime="2026-03-09 03:56:02.652683847 +0000 UTC m=+4487.242347599" watchObservedRunningTime="2026-03-09 03:56:02.655544617 +0000 UTC m=+4487.245208349" Mar 09 03:56:03 crc kubenswrapper[4901]: I0309 03:56:03.934302 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550476-s6wwl" Mar 09 03:56:04 crc kubenswrapper[4901]: I0309 03:56:04.061529 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j95fz\" (UniqueName: \"kubernetes.io/projected/20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13-kube-api-access-j95fz\") pod \"20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13\" (UID: \"20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13\") " Mar 09 03:56:04 crc kubenswrapper[4901]: I0309 03:56:04.072419 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13-kube-api-access-j95fz" (OuterVolumeSpecName: "kube-api-access-j95fz") pod "20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13" (UID: "20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13"). InnerVolumeSpecName "kube-api-access-j95fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:56:04 crc kubenswrapper[4901]: I0309 03:56:04.163265 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j95fz\" (UniqueName: \"kubernetes.io/projected/20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13-kube-api-access-j95fz\") on node \"crc\" DevicePath \"\"" Mar 09 03:56:04 crc kubenswrapper[4901]: I0309 03:56:04.635966 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550476-s6wwl" event={"ID":"20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13","Type":"ContainerDied","Data":"65f657e639d56af306fae828fb477491ea371ee34626539770c8d47a1b314e7c"} Mar 09 03:56:04 crc kubenswrapper[4901]: I0309 03:56:04.636021 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65f657e639d56af306fae828fb477491ea371ee34626539770c8d47a1b314e7c" Mar 09 03:56:04 crc kubenswrapper[4901]: I0309 03:56:04.636045 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550476-s6wwl" Mar 09 03:56:05 crc kubenswrapper[4901]: I0309 03:56:05.038880 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550470-5gdvz"] Mar 09 03:56:05 crc kubenswrapper[4901]: I0309 03:56:05.050254 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550470-5gdvz"] Mar 09 03:56:06 crc kubenswrapper[4901]: I0309 03:56:06.122135 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a81220b-a3e6-47b7-9d48-fd27b882200c" path="/var/lib/kubelet/pods/1a81220b-a3e6-47b7-9d48-fd27b882200c/volumes" Mar 09 03:56:08 crc kubenswrapper[4901]: I0309 03:56:08.795761 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:56:08 crc kubenswrapper[4901]: I0309 03:56:08.795874 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:56:08 crc kubenswrapper[4901]: I0309 03:56:08.873922 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:56:09 crc kubenswrapper[4901]: I0309 03:56:09.767778 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:56:09 crc kubenswrapper[4901]: I0309 03:56:09.823129 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s624m"] Mar 09 03:56:11 crc kubenswrapper[4901]: I0309 03:56:11.699250 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s624m" podUID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerName="registry-server" containerID="cri-o://f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9" gracePeriod=2 Mar 09 03:56:12 crc 
kubenswrapper[4901]: I0309 03:56:12.173468 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.145779 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f62rh\" (UniqueName: \"kubernetes.io/projected/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-kube-api-access-f62rh\") pod \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.145878 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-catalog-content\") pod \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.145914 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-utilities\") pod \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\" (UID: \"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc\") " Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.146923 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-utilities" (OuterVolumeSpecName: "utilities") pod "5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" (UID: "5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.150857 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.156564 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-kube-api-access-f62rh" (OuterVolumeSpecName: "kube-api-access-f62rh") pod "5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" (UID: "5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc"). InnerVolumeSpecName "kube-api-access-f62rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.165649 4901 generic.go:334] "Generic (PLEG): container finished" podID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerID="f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9" exitCode=0 Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.165762 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s624m" Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.165782 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s624m" event={"ID":"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc","Type":"ContainerDied","Data":"f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9"} Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.166513 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s624m" event={"ID":"5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc","Type":"ContainerDied","Data":"b95ae616543836cbd2b832757f540ff7fff2604e9491ef7e39bb0458cc8cad3b"} Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.166574 4901 scope.go:117] "RemoveContainer" containerID="f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9" Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.179894 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" (UID: "5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.220531 4901 scope.go:117] "RemoveContainer" containerID="02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8"
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.240992 4901 scope.go:117] "RemoveContainer" containerID="a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5"
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.252569 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.252683 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f62rh\" (UniqueName: \"kubernetes.io/projected/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc-kube-api-access-f62rh\") on node \"crc\" DevicePath \"\""
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.288241 4901 scope.go:117] "RemoveContainer" containerID="f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9"
Mar 09 03:56:13 crc kubenswrapper[4901]: E0309 03:56:13.288888 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9\": container with ID starting with f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9 not found: ID does not exist" containerID="f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9"
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.288959 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9"} err="failed to get container status \"f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9\": rpc error: code = NotFound desc = could not find container \"f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9\": container with ID starting with f7ee246f59d67c4dbb02e3c85e2ed9f426cdd8511e118ffa15cd6ecb23ac73f9 not found: ID does not exist"
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.289001 4901 scope.go:117] "RemoveContainer" containerID="02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8"
Mar 09 03:56:13 crc kubenswrapper[4901]: E0309 03:56:13.289581 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8\": container with ID starting with 02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8 not found: ID does not exist" containerID="02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8"
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.289693 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8"} err="failed to get container status \"02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8\": rpc error: code = NotFound desc = could not find container \"02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8\": container with ID starting with 02ddf8944fa627a294f3a7299b834bff0d3d2edcd475424151131d9b3acfd6d8 not found: ID does not exist"
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.289801 4901 scope.go:117] "RemoveContainer" containerID="a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5"
Mar 09 03:56:13 crc kubenswrapper[4901]: E0309 03:56:13.290618 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5\": container with ID starting with a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5 not found: ID does not exist" containerID="a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5"
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.290677 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5"} err="failed to get container status \"a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5\": rpc error: code = NotFound desc = could not find container \"a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5\": container with ID starting with a4857cd72df2d7790478809b4c04ccd975b9ac50303a6a0e9b76f139d39466c5 not found: ID does not exist"
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.525698 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s624m"]
Mar 09 03:56:13 crc kubenswrapper[4901]: I0309 03:56:13.535901 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s624m"]
Mar 09 03:56:14 crc kubenswrapper[4901]: I0309 03:56:14.120038 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" path="/var/lib/kubelet/pods/5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc/volumes"
Mar 09 03:56:40 crc kubenswrapper[4901]: I0309 03:56:40.935339 4901 scope.go:117] "RemoveContainer" containerID="f77e67cead9ee618419896089f63f1d531dd326f6f93efacb78cd2deba027485"
Mar 09 03:57:34 crc kubenswrapper[4901]: I0309 03:57:34.991767 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jz9pm"]
Mar 09 03:57:34 crc kubenswrapper[4901]: E0309 03:57:34.992634 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13" containerName="oc"
Mar 09 03:57:34 crc kubenswrapper[4901]: I0309 03:57:34.992651 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13" containerName="oc"
Mar 09 03:57:34 crc kubenswrapper[4901]: E0309 03:57:34.992663 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerName="extract-utilities"
Mar 09 03:57:34 crc kubenswrapper[4901]: I0309 03:57:34.992672 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerName="extract-utilities"
Mar 09 03:57:34 crc kubenswrapper[4901]: E0309 03:57:34.992687 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerName="extract-content"
Mar 09 03:57:34 crc kubenswrapper[4901]: I0309 03:57:34.992696 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerName="extract-content"
Mar 09 03:57:34 crc kubenswrapper[4901]: E0309 03:57:34.992716 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerName="registry-server"
Mar 09 03:57:34 crc kubenswrapper[4901]: I0309 03:57:34.992724 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerName="registry-server"
Mar 09 03:57:34 crc kubenswrapper[4901]: I0309 03:57:34.992895 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae0b74d-6cd7-4a3d-a9c6-73adfe861dfc" containerName="registry-server"
Mar 09 03:57:34 crc kubenswrapper[4901]: I0309 03:57:34.992930 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13" containerName="oc"
Mar 09 03:57:34 crc kubenswrapper[4901]: I0309 03:57:34.994163 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.010501 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jz9pm"]
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.159523 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqtps\" (UniqueName: \"kubernetes.io/projected/5c7433b4-1915-4b36-805a-bc32c9892801-kube-api-access-kqtps\") pod \"redhat-operators-jz9pm\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") " pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.159640 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-utilities\") pod \"redhat-operators-jz9pm\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") " pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.159712 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-catalog-content\") pod \"redhat-operators-jz9pm\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") " pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.261490 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-utilities\") pod \"redhat-operators-jz9pm\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") " pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.261567 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-catalog-content\") pod \"redhat-operators-jz9pm\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") " pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.261622 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqtps\" (UniqueName: \"kubernetes.io/projected/5c7433b4-1915-4b36-805a-bc32c9892801-kube-api-access-kqtps\") pod \"redhat-operators-jz9pm\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") " pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.262085 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-utilities\") pod \"redhat-operators-jz9pm\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") " pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.262209 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-catalog-content\") pod \"redhat-operators-jz9pm\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") " pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.284772 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqtps\" (UniqueName: \"kubernetes.io/projected/5c7433b4-1915-4b36-805a-bc32c9892801-kube-api-access-kqtps\") pod \"redhat-operators-jz9pm\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") " pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.313382 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.776570 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jz9pm"]
Mar 09 03:57:35 crc kubenswrapper[4901]: I0309 03:57:35.968337 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz9pm" event={"ID":"5c7433b4-1915-4b36-805a-bc32c9892801","Type":"ContainerStarted","Data":"f1def01470a4b438f746406b2d01e8bdb628614cbc7ef080bc5e06af186fcb20"}
Mar 09 03:57:36 crc kubenswrapper[4901]: I0309 03:57:36.975412 4901 generic.go:334] "Generic (PLEG): container finished" podID="5c7433b4-1915-4b36-805a-bc32c9892801" containerID="3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6" exitCode=0
Mar 09 03:57:36 crc kubenswrapper[4901]: I0309 03:57:36.975646 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz9pm" event={"ID":"5c7433b4-1915-4b36-805a-bc32c9892801","Type":"ContainerDied","Data":"3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6"}
Mar 09 03:57:37 crc kubenswrapper[4901]: I0309 03:57:37.985607 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz9pm" event={"ID":"5c7433b4-1915-4b36-805a-bc32c9892801","Type":"ContainerStarted","Data":"305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b"}
Mar 09 03:57:39 crc kubenswrapper[4901]: I0309 03:57:38.999948 4901 generic.go:334] "Generic (PLEG): container finished" podID="5c7433b4-1915-4b36-805a-bc32c9892801" containerID="305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b" exitCode=0
Mar 09 03:57:39 crc kubenswrapper[4901]: I0309 03:57:39.000017 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz9pm" event={"ID":"5c7433b4-1915-4b36-805a-bc32c9892801","Type":"ContainerDied","Data":"305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b"}
Mar 09 03:57:40 crc kubenswrapper[4901]: I0309 03:57:40.009313 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz9pm" event={"ID":"5c7433b4-1915-4b36-805a-bc32c9892801","Type":"ContainerStarted","Data":"6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5"}
Mar 09 03:57:40 crc kubenswrapper[4901]: I0309 03:57:40.032340 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jz9pm" podStartSLOduration=3.575211814 podStartE2EDuration="6.032323533s" podCreationTimestamp="2026-03-09 03:57:34 +0000 UTC" firstStartedPulling="2026-03-09 03:57:36.977810661 +0000 UTC m=+4581.567474393" lastFinishedPulling="2026-03-09 03:57:39.43492234 +0000 UTC m=+4584.024586112" observedRunningTime="2026-03-09 03:57:40.029033962 +0000 UTC m=+4584.618697734" watchObservedRunningTime="2026-03-09 03:57:40.032323533 +0000 UTC m=+4584.621987255"
Mar 09 03:57:45 crc kubenswrapper[4901]: I0309 03:57:45.314469 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:45 crc kubenswrapper[4901]: I0309 03:57:45.314799 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:46 crc kubenswrapper[4901]: I0309 03:57:46.379831 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jz9pm" podUID="5c7433b4-1915-4b36-805a-bc32c9892801" containerName="registry-server" probeResult="failure" output=<
Mar 09 03:57:46 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s
Mar 09 03:57:46 crc kubenswrapper[4901]: >
Mar 09 03:57:55 crc kubenswrapper[4901]: I0309 03:57:55.763202 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:55 crc kubenswrapper[4901]: I0309 03:57:55.812594 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:56 crc kubenswrapper[4901]: I0309 03:57:56.013582 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jz9pm"]
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.152049 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jz9pm" podUID="5c7433b4-1915-4b36-805a-bc32c9892801" containerName="registry-server" containerID="cri-o://6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5" gracePeriod=2
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.625294 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.724680 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-catalog-content\") pod \"5c7433b4-1915-4b36-805a-bc32c9892801\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") "
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.724734 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqtps\" (UniqueName: \"kubernetes.io/projected/5c7433b4-1915-4b36-805a-bc32c9892801-kube-api-access-kqtps\") pod \"5c7433b4-1915-4b36-805a-bc32c9892801\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") "
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.724821 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-utilities\") pod \"5c7433b4-1915-4b36-805a-bc32c9892801\" (UID: \"5c7433b4-1915-4b36-805a-bc32c9892801\") "
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.725692 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-utilities" (OuterVolumeSpecName: "utilities") pod "5c7433b4-1915-4b36-805a-bc32c9892801" (UID: "5c7433b4-1915-4b36-805a-bc32c9892801"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.742617 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7433b4-1915-4b36-805a-bc32c9892801-kube-api-access-kqtps" (OuterVolumeSpecName: "kube-api-access-kqtps") pod "5c7433b4-1915-4b36-805a-bc32c9892801" (UID: "5c7433b4-1915-4b36-805a-bc32c9892801"). InnerVolumeSpecName "kube-api-access-kqtps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.826196 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqtps\" (UniqueName: \"kubernetes.io/projected/5c7433b4-1915-4b36-805a-bc32c9892801-kube-api-access-kqtps\") on node \"crc\" DevicePath \"\""
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.826224 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.903454 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c7433b4-1915-4b36-805a-bc32c9892801" (UID: "5c7433b4-1915-4b36-805a-bc32c9892801"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 03:57:57 crc kubenswrapper[4901]: I0309 03:57:57.927610 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7433b4-1915-4b36-805a-bc32c9892801-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.163304 4901 generic.go:334] "Generic (PLEG): container finished" podID="5c7433b4-1915-4b36-805a-bc32c9892801" containerID="6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5" exitCode=0
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.163494 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz9pm" event={"ID":"5c7433b4-1915-4b36-805a-bc32c9892801","Type":"ContainerDied","Data":"6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5"}
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.164287 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz9pm" event={"ID":"5c7433b4-1915-4b36-805a-bc32c9892801","Type":"ContainerDied","Data":"f1def01470a4b438f746406b2d01e8bdb628614cbc7ef080bc5e06af186fcb20"}
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.164395 4901 scope.go:117] "RemoveContainer" containerID="6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5"
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.163610 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jz9pm"
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.192415 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jz9pm"]
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.201922 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jz9pm"]
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.208131 4901 scope.go:117] "RemoveContainer" containerID="305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b"
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.238330 4901 scope.go:117] "RemoveContainer" containerID="3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6"
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.280112 4901 scope.go:117] "RemoveContainer" containerID="6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5"
Mar 09 03:57:58 crc kubenswrapper[4901]: E0309 03:57:58.280695 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5\": container with ID starting with 6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5 not found: ID does not exist" containerID="6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5"
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.280744 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5"} err="failed to get container status \"6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5\": rpc error: code = NotFound desc = could not find container \"6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5\": container with ID starting with 6a786797f997f853aa7b934a6bbd8d6ab111342dc5bf47579f3dfed49f7d47c5 not found: ID does not exist"
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.280779 4901 scope.go:117] "RemoveContainer" containerID="305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b"
Mar 09 03:57:58 crc kubenswrapper[4901]: E0309 03:57:58.281514 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b\": container with ID starting with 305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b not found: ID does not exist" containerID="305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b"
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.281540 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b"} err="failed to get container status \"305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b\": rpc error: code = NotFound desc = could not find container \"305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b\": container with ID starting with 305a9d4043c5f8dfe1af5ce178363ff5fe8634d3a51302dc89b8ec750c2ee14b not found: ID does not exist"
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.281558 4901 scope.go:117] "RemoveContainer" containerID="3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6"
Mar 09 03:57:58 crc kubenswrapper[4901]: E0309 03:57:58.281888 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6\": container with ID starting with 3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6 not found: ID does not exist" containerID="3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6"
Mar 09 03:57:58 crc kubenswrapper[4901]: I0309 03:57:58.281923 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6"} err="failed to get container status \"3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6\": rpc error: code = NotFound desc = could not find container \"3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6\": container with ID starting with 3a7b46b18291ca5d49487cde6715577ddd1de83c52718b2475cd093121fa66f6 not found: ID does not exist"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.117552 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7433b4-1915-4b36-805a-bc32c9892801" path="/var/lib/kubelet/pods/5c7433b4-1915-4b36-805a-bc32c9892801/volumes"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.151321 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550478-sn8lz"]
Mar 09 03:58:00 crc kubenswrapper[4901]: E0309 03:58:00.152097 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7433b4-1915-4b36-805a-bc32c9892801" containerName="registry-server"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.152122 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7433b4-1915-4b36-805a-bc32c9892801" containerName="registry-server"
Mar 09 03:58:00 crc kubenswrapper[4901]: E0309 03:58:00.152143 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7433b4-1915-4b36-805a-bc32c9892801" containerName="extract-utilities"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.152152 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7433b4-1915-4b36-805a-bc32c9892801" containerName="extract-utilities"
Mar 09 03:58:00 crc kubenswrapper[4901]: E0309 03:58:00.152168 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7433b4-1915-4b36-805a-bc32c9892801" containerName="extract-content"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.152177 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7433b4-1915-4b36-805a-bc32c9892801" containerName="extract-content"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.152404 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7433b4-1915-4b36-805a-bc32c9892801" containerName="registry-server"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.153254 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550478-sn8lz"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.156595 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.156849 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.158930 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550478-sn8lz"]
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.163394 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.268351 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hrpd\" (UniqueName: \"kubernetes.io/projected/c89db68b-e834-4486-848e-303d651be8c0-kube-api-access-9hrpd\") pod \"auto-csr-approver-29550478-sn8lz\" (UID: \"c89db68b-e834-4486-848e-303d651be8c0\") " pod="openshift-infra/auto-csr-approver-29550478-sn8lz"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.370089 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hrpd\" (UniqueName: \"kubernetes.io/projected/c89db68b-e834-4486-848e-303d651be8c0-kube-api-access-9hrpd\") pod \"auto-csr-approver-29550478-sn8lz\" (UID: \"c89db68b-e834-4486-848e-303d651be8c0\") " pod="openshift-infra/auto-csr-approver-29550478-sn8lz"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.390279 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hrpd\" (UniqueName: \"kubernetes.io/projected/c89db68b-e834-4486-848e-303d651be8c0-kube-api-access-9hrpd\") pod \"auto-csr-approver-29550478-sn8lz\" (UID: \"c89db68b-e834-4486-848e-303d651be8c0\") " pod="openshift-infra/auto-csr-approver-29550478-sn8lz"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.473517 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550478-sn8lz"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.863299 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.863659 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 03:58:00 crc kubenswrapper[4901]: I0309 03:58:00.944763 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550478-sn8lz"]
Mar 09 03:58:00 crc kubenswrapper[4901]: W0309 03:58:00.962533 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc89db68b_e834_4486_848e_303d651be8c0.slice/crio-b9e1490e8ffb7805252bb52ae85b3e4a059a3fa71ea9df19a8b05cc4a6c5d7ba WatchSource:0}: Error finding container b9e1490e8ffb7805252bb52ae85b3e4a059a3fa71ea9df19a8b05cc4a6c5d7ba: Status 404 returned error can't find the container with id b9e1490e8ffb7805252bb52ae85b3e4a059a3fa71ea9df19a8b05cc4a6c5d7ba
Mar 09 03:58:01 crc kubenswrapper[4901]: I0309 03:58:01.208558 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550478-sn8lz" event={"ID":"c89db68b-e834-4486-848e-303d651be8c0","Type":"ContainerStarted","Data":"b9e1490e8ffb7805252bb52ae85b3e4a059a3fa71ea9df19a8b05cc4a6c5d7ba"}
Mar 09 03:58:02 crc kubenswrapper[4901]: I0309 03:58:02.220091 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550478-sn8lz" event={"ID":"c89db68b-e834-4486-848e-303d651be8c0","Type":"ContainerStarted","Data":"03e4c2ed359b7095b87a721630de8607e1d76adc905ec0c387c97814877b006e"}
Mar 09 03:58:02 crc kubenswrapper[4901]: I0309 03:58:02.239798 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550478-sn8lz" podStartSLOduration=1.410513455 podStartE2EDuration="2.239782273s" podCreationTimestamp="2026-03-09 03:58:00 +0000 UTC" firstStartedPulling="2026-03-09 03:58:00.967138486 +0000 UTC m=+4605.556802228" lastFinishedPulling="2026-03-09 03:58:01.796407304 +0000 UTC m=+4606.386071046" observedRunningTime="2026-03-09 03:58:02.238353658 +0000 UTC m=+4606.828017460" watchObservedRunningTime="2026-03-09 03:58:02.239782273 +0000 UTC m=+4606.829446005"
Mar 09 03:58:03 crc kubenswrapper[4901]: I0309 03:58:03.238363 4901 generic.go:334] "Generic (PLEG): container finished" podID="c89db68b-e834-4486-848e-303d651be8c0" containerID="03e4c2ed359b7095b87a721630de8607e1d76adc905ec0c387c97814877b006e" exitCode=0
Mar 09 03:58:03 crc kubenswrapper[4901]: I0309 03:58:03.238412 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550478-sn8lz" event={"ID":"c89db68b-e834-4486-848e-303d651be8c0","Type":"ContainerDied","Data":"03e4c2ed359b7095b87a721630de8607e1d76adc905ec0c387c97814877b006e"}
Mar 09 03:58:04 crc kubenswrapper[4901]: I0309 03:58:04.987080 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550478-sn8lz"
Mar 09 03:58:05 crc kubenswrapper[4901]: I0309 03:58:05.148809 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hrpd\" (UniqueName: \"kubernetes.io/projected/c89db68b-e834-4486-848e-303d651be8c0-kube-api-access-9hrpd\") pod \"c89db68b-e834-4486-848e-303d651be8c0\" (UID: \"c89db68b-e834-4486-848e-303d651be8c0\") "
Mar 09 03:58:05 crc kubenswrapper[4901]: I0309 03:58:05.156006 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c89db68b-e834-4486-848e-303d651be8c0-kube-api-access-9hrpd" (OuterVolumeSpecName: "kube-api-access-9hrpd") pod "c89db68b-e834-4486-848e-303d651be8c0" (UID: "c89db68b-e834-4486-848e-303d651be8c0"). InnerVolumeSpecName "kube-api-access-9hrpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 03:58:05 crc kubenswrapper[4901]: I0309 03:58:05.250516 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hrpd\" (UniqueName: \"kubernetes.io/projected/c89db68b-e834-4486-848e-303d651be8c0-kube-api-access-9hrpd\") on node \"crc\" DevicePath \"\""
Mar 09 03:58:05 crc kubenswrapper[4901]: I0309 03:58:05.255518 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550478-sn8lz" event={"ID":"c89db68b-e834-4486-848e-303d651be8c0","Type":"ContainerDied","Data":"b9e1490e8ffb7805252bb52ae85b3e4a059a3fa71ea9df19a8b05cc4a6c5d7ba"}
Mar 09 03:58:05 crc kubenswrapper[4901]: I0309 03:58:05.255724 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9e1490e8ffb7805252bb52ae85b3e4a059a3fa71ea9df19a8b05cc4a6c5d7ba"
Mar 09 03:58:05 crc kubenswrapper[4901]: I0309 03:58:05.255599 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550478-sn8lz"
Mar 09 03:58:05 crc kubenswrapper[4901]: I0309 03:58:05.325965 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550472-6xqn9"]
Mar 09 03:58:05 crc kubenswrapper[4901]: I0309 03:58:05.337044 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550472-6xqn9"]
Mar 09 03:58:06 crc kubenswrapper[4901]: I0309 03:58:06.119524 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c5f03c-0270-4605-a375-76bd7308381e" path="/var/lib/kubelet/pods/b0c5f03c-0270-4605-a375-76bd7308381e/volumes"
Mar 09 03:58:30 crc kubenswrapper[4901]: I0309 03:58:30.863591 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 03:58:30 crc kubenswrapper[4901]: I0309 03:58:30.864277 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 03:58:41 crc kubenswrapper[4901]: I0309 03:58:41.088709 4901 scope.go:117] "RemoveContainer" containerID="1b1343d21249b5c8b933d1331da9795623fd2c10a4cb2c6ffd9c1bb8dc890c32"
Mar 09 03:59:00 crc kubenswrapper[4901]: I0309 03:59:00.862915 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 03:59:00 crc kubenswrapper[4901]: I0309 03:59:00.865063 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 03:59:00 crc kubenswrapper[4901]: I0309 03:59:00.865307 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998"
Mar 09 03:59:00 crc kubenswrapper[4901]: I0309 03:59:00.866391 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 03:59:00 crc kubenswrapper[4901]: I0309 03:59:00.866633 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" gracePeriod=600
Mar 09 03:59:01 crc kubenswrapper[4901]: E0309 03:59:01.018942 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:59:01 crc kubenswrapper[4901]: I0309 03:59:01.774220 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" exitCode=0
Mar 09 03:59:01 crc kubenswrapper[4901]: I0309 03:59:01.774282 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39"}
Mar 09 03:59:01 crc kubenswrapper[4901]: I0309 03:59:01.774387 4901 scope.go:117] "RemoveContainer" containerID="2bbd5a3d2c0fe44692c1657737ce4c08f09c92b4a8c3331b9fa71858ac3e491a"
Mar 09 03:59:01 crc kubenswrapper[4901]: I0309 03:59:01.775048 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39"
Mar 09 03:59:01 crc kubenswrapper[4901]: E0309 03:59:01.775450 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:59:16 crc kubenswrapper[4901]: I0309 03:59:16.114681 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39"
Mar 09 03:59:16 crc kubenswrapper[4901]: E0309 03:59:16.115775 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:59:29 crc kubenswrapper[4901]: I0309 03:59:29.107835 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39"
Mar 09 03:59:29 crc kubenswrapper[4901]: E0309 03:59:29.109059 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a"
Mar 09 03:59:43 crc kubenswrapper[4901]: I0309 03:59:43.105785 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39"
Mar 09 03:59:43 crc kubenswrapper[4901]: E0309 03:59:43.106851 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 03:59:57 crc kubenswrapper[4901]: I0309 03:59:57.107266 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 03:59:57 crc kubenswrapper[4901]: E0309 03:59:57.108281 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.165820 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550480-4trsp"] Mar 09 04:00:00 crc kubenswrapper[4901]: E0309 04:00:00.166940 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89db68b-e834-4486-848e-303d651be8c0" containerName="oc" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.166960 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89db68b-e834-4486-848e-303d651be8c0" containerName="oc" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.167189 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c89db68b-e834-4486-848e-303d651be8c0" containerName="oc" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.167906 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550480-4trsp" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.172791 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.172948 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.173122 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.182364 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9"] Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.183900 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.188648 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.188726 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.192318 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550480-4trsp"] Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.202107 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9"] Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.282107 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-secret-volume\") pod \"collect-profiles-29550480-f8sc9\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.282313 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lj9l\" (UniqueName: \"kubernetes.io/projected/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-kube-api-access-5lj9l\") pod \"collect-profiles-29550480-f8sc9\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.282369 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-config-volume\") pod \"collect-profiles-29550480-f8sc9\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.282484 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspxg\" (UniqueName: \"kubernetes.io/projected/016d7314-606e-4045-bdbd-1554776219aa-kube-api-access-vspxg\") pod \"auto-csr-approver-29550480-4trsp\" (UID: \"016d7314-606e-4045-bdbd-1554776219aa\") " pod="openshift-infra/auto-csr-approver-29550480-4trsp" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.384684 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-secret-volume\") pod \"collect-profiles-29550480-f8sc9\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 
04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.384853 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lj9l\" (UniqueName: \"kubernetes.io/projected/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-kube-api-access-5lj9l\") pod \"collect-profiles-29550480-f8sc9\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.384907 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-config-volume\") pod \"collect-profiles-29550480-f8sc9\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.384972 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vspxg\" (UniqueName: \"kubernetes.io/projected/016d7314-606e-4045-bdbd-1554776219aa-kube-api-access-vspxg\") pod \"auto-csr-approver-29550480-4trsp\" (UID: \"016d7314-606e-4045-bdbd-1554776219aa\") " pod="openshift-infra/auto-csr-approver-29550480-4trsp" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.385892 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-config-volume\") pod \"collect-profiles-29550480-f8sc9\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.405569 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-secret-volume\") pod \"collect-profiles-29550480-f8sc9\" (UID: 
\"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.414271 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspxg\" (UniqueName: \"kubernetes.io/projected/016d7314-606e-4045-bdbd-1554776219aa-kube-api-access-vspxg\") pod \"auto-csr-approver-29550480-4trsp\" (UID: \"016d7314-606e-4045-bdbd-1554776219aa\") " pod="openshift-infra/auto-csr-approver-29550480-4trsp" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.414482 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lj9l\" (UniqueName: \"kubernetes.io/projected/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-kube-api-access-5lj9l\") pod \"collect-profiles-29550480-f8sc9\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.499355 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550480-4trsp" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.510587 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.805723 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550480-4trsp"] Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.811118 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 04:00:00 crc kubenswrapper[4901]: I0309 04:00:00.876400 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9"] Mar 09 04:00:00 crc kubenswrapper[4901]: W0309 04:00:00.878660 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd0bea92_7299_48b7_a4e2_f907b51cfe8d.slice/crio-bdf06087471844d89f8eb50dbc991f1bb9ac4d9d34bd1000ad7a06a97e56e31a WatchSource:0}: Error finding container bdf06087471844d89f8eb50dbc991f1bb9ac4d9d34bd1000ad7a06a97e56e31a: Status 404 returned error can't find the container with id bdf06087471844d89f8eb50dbc991f1bb9ac4d9d34bd1000ad7a06a97e56e31a Mar 09 04:00:01 crc kubenswrapper[4901]: I0309 04:00:01.813513 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550480-4trsp" event={"ID":"016d7314-606e-4045-bdbd-1554776219aa","Type":"ContainerStarted","Data":"563756b62aefbff061881e9ae362ecf36954028173f48005b1d51e3bc6c963ed"} Mar 09 04:00:01 crc kubenswrapper[4901]: I0309 04:00:01.817418 4901 generic.go:334] "Generic (PLEG): container finished" podID="fd0bea92-7299-48b7-a4e2-f907b51cfe8d" containerID="30becb8b77d9e617775b07ee2f8a8b893aef4a3706be7e37aef36a957e698991" exitCode=0 Mar 09 04:00:01 crc kubenswrapper[4901]: I0309 04:00:01.817473 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" 
event={"ID":"fd0bea92-7299-48b7-a4e2-f907b51cfe8d","Type":"ContainerDied","Data":"30becb8b77d9e617775b07ee2f8a8b893aef4a3706be7e37aef36a957e698991"} Mar 09 04:00:01 crc kubenswrapper[4901]: I0309 04:00:01.817503 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" event={"ID":"fd0bea92-7299-48b7-a4e2-f907b51cfe8d","Type":"ContainerStarted","Data":"bdf06087471844d89f8eb50dbc991f1bb9ac4d9d34bd1000ad7a06a97e56e31a"} Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.260798 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.345077 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-config-volume\") pod \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.345271 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-secret-volume\") pod \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.345407 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lj9l\" (UniqueName: \"kubernetes.io/projected/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-kube-api-access-5lj9l\") pod \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\" (UID: \"fd0bea92-7299-48b7-a4e2-f907b51cfe8d\") " Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.346180 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "fd0bea92-7299-48b7-a4e2-f907b51cfe8d" (UID: "fd0bea92-7299-48b7-a4e2-f907b51cfe8d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.352395 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-kube-api-access-5lj9l" (OuterVolumeSpecName: "kube-api-access-5lj9l") pod "fd0bea92-7299-48b7-a4e2-f907b51cfe8d" (UID: "fd0bea92-7299-48b7-a4e2-f907b51cfe8d"). InnerVolumeSpecName "kube-api-access-5lj9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.353455 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fd0bea92-7299-48b7-a4e2-f907b51cfe8d" (UID: "fd0bea92-7299-48b7-a4e2-f907b51cfe8d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.447321 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lj9l\" (UniqueName: \"kubernetes.io/projected/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-kube-api-access-5lj9l\") on node \"crc\" DevicePath \"\"" Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.447363 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.447384 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd0bea92-7299-48b7-a4e2-f907b51cfe8d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.839308 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" event={"ID":"fd0bea92-7299-48b7-a4e2-f907b51cfe8d","Type":"ContainerDied","Data":"bdf06087471844d89f8eb50dbc991f1bb9ac4d9d34bd1000ad7a06a97e56e31a"} Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.839372 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdf06087471844d89f8eb50dbc991f1bb9ac4d9d34bd1000ad7a06a97e56e31a" Mar 09 04:00:03 crc kubenswrapper[4901]: I0309 04:00:03.839448 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550480-f8sc9" Mar 09 04:00:04 crc kubenswrapper[4901]: I0309 04:00:04.376993 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp"] Mar 09 04:00:04 crc kubenswrapper[4901]: I0309 04:00:04.387354 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550435-4qfpp"] Mar 09 04:00:06 crc kubenswrapper[4901]: I0309 04:00:06.135870 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e2f2de-1f76-43b5-97f2-0ad3cf044169" path="/var/lib/kubelet/pods/c5e2f2de-1f76-43b5-97f2-0ad3cf044169/volumes" Mar 09 04:00:07 crc kubenswrapper[4901]: I0309 04:00:07.856335 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4xsdb"] Mar 09 04:00:07 crc kubenswrapper[4901]: E0309 04:00:07.857138 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0bea92-7299-48b7-a4e2-f907b51cfe8d" containerName="collect-profiles" Mar 09 04:00:07 crc kubenswrapper[4901]: I0309 04:00:07.857161 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0bea92-7299-48b7-a4e2-f907b51cfe8d" containerName="collect-profiles" Mar 09 04:00:07 crc kubenswrapper[4901]: I0309 04:00:07.857497 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0bea92-7299-48b7-a4e2-f907b51cfe8d" containerName="collect-profiles" Mar 09 04:00:07 crc kubenswrapper[4901]: I0309 04:00:07.859635 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:07 crc kubenswrapper[4901]: I0309 04:00:07.866173 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xsdb"] Mar 09 04:00:07 crc kubenswrapper[4901]: I0309 04:00:07.922031 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-catalog-content\") pod \"certified-operators-4xsdb\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:07 crc kubenswrapper[4901]: I0309 04:00:07.922104 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-862sd\" (UniqueName: \"kubernetes.io/projected/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-kube-api-access-862sd\") pod \"certified-operators-4xsdb\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:07 crc kubenswrapper[4901]: I0309 04:00:07.922141 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-utilities\") pod \"certified-operators-4xsdb\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.023532 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-catalog-content\") pod \"certified-operators-4xsdb\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.023653 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-862sd\" (UniqueName: \"kubernetes.io/projected/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-kube-api-access-862sd\") pod \"certified-operators-4xsdb\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.023706 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-utilities\") pod \"certified-operators-4xsdb\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.024022 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-catalog-content\") pod \"certified-operators-4xsdb\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.024173 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-utilities\") pod \"certified-operators-4xsdb\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.043260 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-862sd\" (UniqueName: \"kubernetes.io/projected/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-kube-api-access-862sd\") pod \"certified-operators-4xsdb\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.190586 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.466078 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xsdb"] Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.886257 4901 generic.go:334] "Generic (PLEG): container finished" podID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" containerID="df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b" exitCode=0 Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.886306 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsdb" event={"ID":"47bcc99e-5a38-4daa-b1da-eb28a58d19f9","Type":"ContainerDied","Data":"df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b"} Mar 09 04:00:08 crc kubenswrapper[4901]: I0309 04:00:08.886341 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsdb" event={"ID":"47bcc99e-5a38-4daa-b1da-eb28a58d19f9","Type":"ContainerStarted","Data":"c2b2314dbb2d3aaeadb497792df0ae9d465331f3ace1309ff9227ea70f31a659"} Mar 09 04:00:11 crc kubenswrapper[4901]: I0309 04:00:11.107006 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:00:11 crc kubenswrapper[4901]: E0309 04:00:11.108109 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:00:15 crc kubenswrapper[4901]: I0309 04:00:15.958597 4901 generic.go:334] "Generic (PLEG): container finished" podID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" 
containerID="b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618" exitCode=0 Mar 09 04:00:15 crc kubenswrapper[4901]: I0309 04:00:15.958657 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsdb" event={"ID":"47bcc99e-5a38-4daa-b1da-eb28a58d19f9","Type":"ContainerDied","Data":"b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618"} Mar 09 04:00:17 crc kubenswrapper[4901]: I0309 04:00:17.981994 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550480-4trsp" event={"ID":"016d7314-606e-4045-bdbd-1554776219aa","Type":"ContainerStarted","Data":"ed0b53477ba76499d938ed98d2b45817a433ae1e3efcf11a6aecd464c8f5a3e1"} Mar 09 04:00:17 crc kubenswrapper[4901]: I0309 04:00:17.985304 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsdb" event={"ID":"47bcc99e-5a38-4daa-b1da-eb28a58d19f9","Type":"ContainerStarted","Data":"3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce"} Mar 09 04:00:18 crc kubenswrapper[4901]: I0309 04:00:18.047651 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550480-4trsp" podStartSLOduration=1.359843347 podStartE2EDuration="18.047622087s" podCreationTimestamp="2026-03-09 04:00:00 +0000 UTC" firstStartedPulling="2026-03-09 04:00:00.810613427 +0000 UTC m=+4725.400277199" lastFinishedPulling="2026-03-09 04:00:17.498392157 +0000 UTC m=+4742.088055939" observedRunningTime="2026-03-09 04:00:18.044135392 +0000 UTC m=+4742.633799144" watchObservedRunningTime="2026-03-09 04:00:18.047622087 +0000 UTC m=+4742.637285869" Mar 09 04:00:18 crc kubenswrapper[4901]: I0309 04:00:18.077360 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4xsdb" podStartSLOduration=3.480596203 podStartE2EDuration="11.077337907s" podCreationTimestamp="2026-03-09 04:00:07 
+0000 UTC" firstStartedPulling="2026-03-09 04:00:08.888733264 +0000 UTC m=+4733.478397006" lastFinishedPulling="2026-03-09 04:00:16.485474938 +0000 UTC m=+4741.075138710" observedRunningTime="2026-03-09 04:00:18.06237907 +0000 UTC m=+4742.652042812" watchObservedRunningTime="2026-03-09 04:00:18.077337907 +0000 UTC m=+4742.667001649" Mar 09 04:00:18 crc kubenswrapper[4901]: I0309 04:00:18.191360 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:18 crc kubenswrapper[4901]: I0309 04:00:18.191428 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:18 crc kubenswrapper[4901]: I0309 04:00:18.994763 4901 generic.go:334] "Generic (PLEG): container finished" podID="016d7314-606e-4045-bdbd-1554776219aa" containerID="ed0b53477ba76499d938ed98d2b45817a433ae1e3efcf11a6aecd464c8f5a3e1" exitCode=0 Mar 09 04:00:18 crc kubenswrapper[4901]: I0309 04:00:18.994930 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550480-4trsp" event={"ID":"016d7314-606e-4045-bdbd-1554776219aa","Type":"ContainerDied","Data":"ed0b53477ba76499d938ed98d2b45817a433ae1e3efcf11a6aecd464c8f5a3e1"} Mar 09 04:00:19 crc kubenswrapper[4901]: I0309 04:00:19.246263 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-4xsdb" podUID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" containerName="registry-server" probeResult="failure" output=< Mar 09 04:00:19 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 04:00:19 crc kubenswrapper[4901]: > Mar 09 04:00:20 crc kubenswrapper[4901]: I0309 04:00:20.388844 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550480-4trsp" Mar 09 04:00:20 crc kubenswrapper[4901]: I0309 04:00:20.530540 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vspxg\" (UniqueName: \"kubernetes.io/projected/016d7314-606e-4045-bdbd-1554776219aa-kube-api-access-vspxg\") pod \"016d7314-606e-4045-bdbd-1554776219aa\" (UID: \"016d7314-606e-4045-bdbd-1554776219aa\") " Mar 09 04:00:20 crc kubenswrapper[4901]: I0309 04:00:20.539531 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016d7314-606e-4045-bdbd-1554776219aa-kube-api-access-vspxg" (OuterVolumeSpecName: "kube-api-access-vspxg") pod "016d7314-606e-4045-bdbd-1554776219aa" (UID: "016d7314-606e-4045-bdbd-1554776219aa"). InnerVolumeSpecName "kube-api-access-vspxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:00:20 crc kubenswrapper[4901]: I0309 04:00:20.634506 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vspxg\" (UniqueName: \"kubernetes.io/projected/016d7314-606e-4045-bdbd-1554776219aa-kube-api-access-vspxg\") on node \"crc\" DevicePath \"\"" Mar 09 04:00:21 crc kubenswrapper[4901]: I0309 04:00:21.028376 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550480-4trsp" event={"ID":"016d7314-606e-4045-bdbd-1554776219aa","Type":"ContainerDied","Data":"563756b62aefbff061881e9ae362ecf36954028173f48005b1d51e3bc6c963ed"} Mar 09 04:00:21 crc kubenswrapper[4901]: I0309 04:00:21.028433 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="563756b62aefbff061881e9ae362ecf36954028173f48005b1d51e3bc6c963ed" Mar 09 04:00:21 crc kubenswrapper[4901]: I0309 04:00:21.028469 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550480-4trsp" Mar 09 04:00:21 crc kubenswrapper[4901]: I0309 04:00:21.478501 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550474-hjmmp"] Mar 09 04:00:21 crc kubenswrapper[4901]: I0309 04:00:21.483109 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550474-hjmmp"] Mar 09 04:00:22 crc kubenswrapper[4901]: I0309 04:00:22.118462 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2d162d-bdfc-43b9-a4ec-a72f813974a2" path="/var/lib/kubelet/pods/dd2d162d-bdfc-43b9-a4ec-a72f813974a2/volumes" Mar 09 04:00:26 crc kubenswrapper[4901]: I0309 04:00:26.114536 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:00:26 crc kubenswrapper[4901]: E0309 04:00:26.115337 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:00:28 crc kubenswrapper[4901]: I0309 04:00:28.269467 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:28 crc kubenswrapper[4901]: I0309 04:00:28.342130 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:28 crc kubenswrapper[4901]: I0309 04:00:28.527641 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xsdb"] Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.125051 4901 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4xsdb" podUID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" containerName="registry-server" containerID="cri-o://3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce" gracePeriod=2 Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.630100 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.701873 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-utilities\") pod \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.701973 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-catalog-content\") pod \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.702080 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-862sd\" (UniqueName: \"kubernetes.io/projected/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-kube-api-access-862sd\") pod \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\" (UID: \"47bcc99e-5a38-4daa-b1da-eb28a58d19f9\") " Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.704878 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-utilities" (OuterVolumeSpecName: "utilities") pod "47bcc99e-5a38-4daa-b1da-eb28a58d19f9" (UID: "47bcc99e-5a38-4daa-b1da-eb28a58d19f9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.712650 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-kube-api-access-862sd" (OuterVolumeSpecName: "kube-api-access-862sd") pod "47bcc99e-5a38-4daa-b1da-eb28a58d19f9" (UID: "47bcc99e-5a38-4daa-b1da-eb28a58d19f9"). InnerVolumeSpecName "kube-api-access-862sd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.775670 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47bcc99e-5a38-4daa-b1da-eb28a58d19f9" (UID: "47bcc99e-5a38-4daa-b1da-eb28a58d19f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.805191 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-862sd\" (UniqueName: \"kubernetes.io/projected/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-kube-api-access-862sd\") on node \"crc\" DevicePath \"\"" Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.805278 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:00:30 crc kubenswrapper[4901]: I0309 04:00:30.805298 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bcc99e-5a38-4daa-b1da-eb28a58d19f9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.138489 4901 generic.go:334] "Generic (PLEG): container finished" podID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" 
containerID="3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce" exitCode=0 Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.138546 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsdb" event={"ID":"47bcc99e-5a38-4daa-b1da-eb28a58d19f9","Type":"ContainerDied","Data":"3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce"} Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.138585 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsdb" event={"ID":"47bcc99e-5a38-4daa-b1da-eb28a58d19f9","Type":"ContainerDied","Data":"c2b2314dbb2d3aaeadb497792df0ae9d465331f3ace1309ff9227ea70f31a659"} Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.138613 4901 scope.go:117] "RemoveContainer" containerID="3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce" Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.138652 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4xsdb" Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.186374 4901 scope.go:117] "RemoveContainer" containerID="b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618" Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.204971 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xsdb"] Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.217004 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4xsdb"] Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.238945 4901 scope.go:117] "RemoveContainer" containerID="df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b" Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.269777 4901 scope.go:117] "RemoveContainer" containerID="3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce" Mar 09 04:00:31 crc kubenswrapper[4901]: E0309 04:00:31.270452 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce\": container with ID starting with 3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce not found: ID does not exist" containerID="3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce" Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.270512 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce"} err="failed to get container status \"3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce\": rpc error: code = NotFound desc = could not find container \"3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce\": container with ID starting with 3166088df36aef79c2e59c1a392871eba7e31e64dc76339260405128eea780ce not 
found: ID does not exist" Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.270539 4901 scope.go:117] "RemoveContainer" containerID="b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618" Mar 09 04:00:31 crc kubenswrapper[4901]: E0309 04:00:31.270922 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618\": container with ID starting with b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618 not found: ID does not exist" containerID="b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618" Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.271095 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618"} err="failed to get container status \"b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618\": rpc error: code = NotFound desc = could not find container \"b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618\": container with ID starting with b99f2e4c9b3c85a4ac01eec79702c8d1a15de949d32ec03763c48ae492df0618 not found: ID does not exist" Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.271283 4901 scope.go:117] "RemoveContainer" containerID="df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b" Mar 09 04:00:31 crc kubenswrapper[4901]: E0309 04:00:31.272077 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b\": container with ID starting with df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b not found: ID does not exist" containerID="df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b" Mar 09 04:00:31 crc kubenswrapper[4901]: I0309 04:00:31.272110 4901 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b"} err="failed to get container status \"df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b\": rpc error: code = NotFound desc = could not find container \"df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b\": container with ID starting with df0d010db9a3c9981c2395d5249a422b6dbc874213033a95875fc49d9e670f1b not found: ID does not exist" Mar 09 04:00:32 crc kubenswrapper[4901]: I0309 04:00:32.124575 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" path="/var/lib/kubelet/pods/47bcc99e-5a38-4daa-b1da-eb28a58d19f9/volumes" Mar 09 04:00:37 crc kubenswrapper[4901]: I0309 04:00:37.106113 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:00:37 crc kubenswrapper[4901]: E0309 04:00:37.107089 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:00:41 crc kubenswrapper[4901]: I0309 04:00:41.194571 4901 scope.go:117] "RemoveContainer" containerID="fb942c6ec6ae344c9d8d8bfa81bfb200c00e8c88d136dbe8cce57bed0c9275ee" Mar 09 04:00:41 crc kubenswrapper[4901]: I0309 04:00:41.238649 4901 scope.go:117] "RemoveContainer" containerID="656f267c36a077587532187ba28b0e350644f5e384ff33f8765d78fdb18236d8" Mar 09 04:00:50 crc kubenswrapper[4901]: I0309 04:00:50.106892 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:00:50 
crc kubenswrapper[4901]: E0309 04:00:50.108866 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:01:04 crc kubenswrapper[4901]: I0309 04:01:04.107166 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:01:04 crc kubenswrapper[4901]: E0309 04:01:04.108519 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:01:17 crc kubenswrapper[4901]: I0309 04:01:17.106218 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:01:17 crc kubenswrapper[4901]: E0309 04:01:17.107470 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:01:32 crc kubenswrapper[4901]: I0309 04:01:32.107862 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" 
Mar 09 04:01:32 crc kubenswrapper[4901]: E0309 04:01:32.108870 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:01:45 crc kubenswrapper[4901]: I0309 04:01:45.106693 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:01:45 crc kubenswrapper[4901]: E0309 04:01:45.107687 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:01:57 crc kubenswrapper[4901]: I0309 04:01:57.106341 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:01:57 crc kubenswrapper[4901]: E0309 04:01:57.107389 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.177860 4901 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29550482-cvb95"] Mar 09 04:02:00 crc kubenswrapper[4901]: E0309 04:02:00.178697 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016d7314-606e-4045-bdbd-1554776219aa" containerName="oc" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.178720 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="016d7314-606e-4045-bdbd-1554776219aa" containerName="oc" Mar 09 04:02:00 crc kubenswrapper[4901]: E0309 04:02:00.178742 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" containerName="extract-utilities" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.178754 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" containerName="extract-utilities" Mar 09 04:02:00 crc kubenswrapper[4901]: E0309 04:02:00.178783 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" containerName="extract-content" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.178796 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" containerName="extract-content" Mar 09 04:02:00 crc kubenswrapper[4901]: E0309 04:02:00.178838 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" containerName="registry-server" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.178850 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" containerName="registry-server" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.179091 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="016d7314-606e-4045-bdbd-1554776219aa" containerName="oc" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.179116 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="47bcc99e-5a38-4daa-b1da-eb28a58d19f9" 
containerName="registry-server" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.179892 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550482-cvb95" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.183456 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.183574 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.183618 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.189982 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550482-cvb95"] Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.220173 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzdfm\" (UniqueName: \"kubernetes.io/projected/1923e3c5-bc27-4914-9789-a5f731fc2725-kube-api-access-rzdfm\") pod \"auto-csr-approver-29550482-cvb95\" (UID: \"1923e3c5-bc27-4914-9789-a5f731fc2725\") " pod="openshift-infra/auto-csr-approver-29550482-cvb95" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.322042 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzdfm\" (UniqueName: \"kubernetes.io/projected/1923e3c5-bc27-4914-9789-a5f731fc2725-kube-api-access-rzdfm\") pod \"auto-csr-approver-29550482-cvb95\" (UID: \"1923e3c5-bc27-4914-9789-a5f731fc2725\") " pod="openshift-infra/auto-csr-approver-29550482-cvb95" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.359834 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzdfm\" (UniqueName: 
\"kubernetes.io/projected/1923e3c5-bc27-4914-9789-a5f731fc2725-kube-api-access-rzdfm\") pod \"auto-csr-approver-29550482-cvb95\" (UID: \"1923e3c5-bc27-4914-9789-a5f731fc2725\") " pod="openshift-infra/auto-csr-approver-29550482-cvb95" Mar 09 04:02:00 crc kubenswrapper[4901]: I0309 04:02:00.514732 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550482-cvb95" Mar 09 04:02:01 crc kubenswrapper[4901]: I0309 04:02:01.041258 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550482-cvb95"] Mar 09 04:02:02 crc kubenswrapper[4901]: I0309 04:02:02.017800 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550482-cvb95" event={"ID":"1923e3c5-bc27-4914-9789-a5f731fc2725","Type":"ContainerStarted","Data":"9da2ef4e47976ded3f861f2e63659ae3dfbce7536dda00411b41cd4ce59a031e"} Mar 09 04:02:03 crc kubenswrapper[4901]: I0309 04:02:03.031565 4901 generic.go:334] "Generic (PLEG): container finished" podID="1923e3c5-bc27-4914-9789-a5f731fc2725" containerID="36b1b227bfe6f1a73a7a1cb4120a1829a8c4b32d60c17b57e40d1201f9ca2bd4" exitCode=0 Mar 09 04:02:03 crc kubenswrapper[4901]: I0309 04:02:03.031630 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550482-cvb95" event={"ID":"1923e3c5-bc27-4914-9789-a5f731fc2725","Type":"ContainerDied","Data":"36b1b227bfe6f1a73a7a1cb4120a1829a8c4b32d60c17b57e40d1201f9ca2bd4"} Mar 09 04:02:04 crc kubenswrapper[4901]: I0309 04:02:04.408692 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550482-cvb95" Mar 09 04:02:04 crc kubenswrapper[4901]: I0309 04:02:04.591074 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzdfm\" (UniqueName: \"kubernetes.io/projected/1923e3c5-bc27-4914-9789-a5f731fc2725-kube-api-access-rzdfm\") pod \"1923e3c5-bc27-4914-9789-a5f731fc2725\" (UID: \"1923e3c5-bc27-4914-9789-a5f731fc2725\") " Mar 09 04:02:04 crc kubenswrapper[4901]: I0309 04:02:04.599177 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1923e3c5-bc27-4914-9789-a5f731fc2725-kube-api-access-rzdfm" (OuterVolumeSpecName: "kube-api-access-rzdfm") pod "1923e3c5-bc27-4914-9789-a5f731fc2725" (UID: "1923e3c5-bc27-4914-9789-a5f731fc2725"). InnerVolumeSpecName "kube-api-access-rzdfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:02:04 crc kubenswrapper[4901]: I0309 04:02:04.692620 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzdfm\" (UniqueName: \"kubernetes.io/projected/1923e3c5-bc27-4914-9789-a5f731fc2725-kube-api-access-rzdfm\") on node \"crc\" DevicePath \"\"" Mar 09 04:02:05 crc kubenswrapper[4901]: I0309 04:02:05.053501 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550482-cvb95" event={"ID":"1923e3c5-bc27-4914-9789-a5f731fc2725","Type":"ContainerDied","Data":"9da2ef4e47976ded3f861f2e63659ae3dfbce7536dda00411b41cd4ce59a031e"} Mar 09 04:02:05 crc kubenswrapper[4901]: I0309 04:02:05.053565 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9da2ef4e47976ded3f861f2e63659ae3dfbce7536dda00411b41cd4ce59a031e" Mar 09 04:02:05 crc kubenswrapper[4901]: I0309 04:02:05.053605 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550482-cvb95" Mar 09 04:02:05 crc kubenswrapper[4901]: I0309 04:02:05.510670 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550476-s6wwl"] Mar 09 04:02:05 crc kubenswrapper[4901]: I0309 04:02:05.520886 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550476-s6wwl"] Mar 09 04:02:06 crc kubenswrapper[4901]: I0309 04:02:06.115606 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13" path="/var/lib/kubelet/pods/20ce0ce0-5535-4b6b-ba9f-44ba2b3bbc13/volumes" Mar 09 04:02:12 crc kubenswrapper[4901]: I0309 04:02:12.106586 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:02:12 crc kubenswrapper[4901]: E0309 04:02:12.107817 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:02:24 crc kubenswrapper[4901]: I0309 04:02:24.107326 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:02:24 crc kubenswrapper[4901]: E0309 04:02:24.108149 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:02:35 crc kubenswrapper[4901]: I0309 04:02:35.105693 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:02:35 crc kubenswrapper[4901]: E0309 04:02:35.106622 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:02:41 crc kubenswrapper[4901]: I0309 04:02:41.381725 4901 scope.go:117] "RemoveContainer" containerID="1ad4b5968552d7fcdb51346e35443083df9b75fe6ebc01bc458c2adcf76e9e05" Mar 09 04:02:47 crc kubenswrapper[4901]: I0309 04:02:47.106599 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:02:47 crc kubenswrapper[4901]: E0309 04:02:47.107567 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:02:58 crc kubenswrapper[4901]: I0309 04:02:58.106667 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:02:58 crc kubenswrapper[4901]: E0309 04:02:58.107520 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:03:11 crc kubenswrapper[4901]: I0309 04:03:11.109651 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:03:11 crc kubenswrapper[4901]: E0309 04:03:11.111793 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:03:25 crc kubenswrapper[4901]: I0309 04:03:25.106697 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:03:25 crc kubenswrapper[4901]: E0309 04:03:25.107457 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:03:37 crc kubenswrapper[4901]: I0309 04:03:37.107088 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:03:37 crc kubenswrapper[4901]: E0309 04:03:37.107880 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:03:51 crc kubenswrapper[4901]: I0309 04:03:51.106957 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:03:51 crc kubenswrapper[4901]: E0309 04:03:51.107903 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.163294 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550484-thrv5"] Mar 09 04:04:00 crc kubenswrapper[4901]: E0309 04:04:00.164346 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1923e3c5-bc27-4914-9789-a5f731fc2725" containerName="oc" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.164366 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1923e3c5-bc27-4914-9789-a5f731fc2725" containerName="oc" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.164650 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1923e3c5-bc27-4914-9789-a5f731fc2725" containerName="oc" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.165373 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550484-thrv5" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.168673 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.169347 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.169609 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.184302 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550484-thrv5"] Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.326043 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skc8z\" (UniqueName: \"kubernetes.io/projected/045dd9d2-24bb-4198-ad1f-1dfdfe81afb5-kube-api-access-skc8z\") pod \"auto-csr-approver-29550484-thrv5\" (UID: \"045dd9d2-24bb-4198-ad1f-1dfdfe81afb5\") " pod="openshift-infra/auto-csr-approver-29550484-thrv5" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.427700 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skc8z\" (UniqueName: \"kubernetes.io/projected/045dd9d2-24bb-4198-ad1f-1dfdfe81afb5-kube-api-access-skc8z\") pod \"auto-csr-approver-29550484-thrv5\" (UID: \"045dd9d2-24bb-4198-ad1f-1dfdfe81afb5\") " pod="openshift-infra/auto-csr-approver-29550484-thrv5" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.455262 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skc8z\" (UniqueName: \"kubernetes.io/projected/045dd9d2-24bb-4198-ad1f-1dfdfe81afb5-kube-api-access-skc8z\") pod \"auto-csr-approver-29550484-thrv5\" (UID: \"045dd9d2-24bb-4198-ad1f-1dfdfe81afb5\") " 
pod="openshift-infra/auto-csr-approver-29550484-thrv5" Mar 09 04:04:00 crc kubenswrapper[4901]: I0309 04:04:00.501360 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550484-thrv5" Mar 09 04:04:01 crc kubenswrapper[4901]: I0309 04:04:01.018305 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550484-thrv5"] Mar 09 04:04:01 crc kubenswrapper[4901]: I0309 04:04:01.156414 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550484-thrv5" event={"ID":"045dd9d2-24bb-4198-ad1f-1dfdfe81afb5","Type":"ContainerStarted","Data":"75af71436862ccafb63f7c33427709f4097ff8e8c9612c57666980720c85aadb"} Mar 09 04:04:02 crc kubenswrapper[4901]: I0309 04:04:02.106428 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:04:03 crc kubenswrapper[4901]: I0309 04:04:03.181911 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"ee11181bba501a52807c1c6a38036b291b104f5a882250fea8dbae3a89c2ab93"} Mar 09 04:04:03 crc kubenswrapper[4901]: I0309 04:04:03.184611 4901 generic.go:334] "Generic (PLEG): container finished" podID="045dd9d2-24bb-4198-ad1f-1dfdfe81afb5" containerID="8620404e3532a82760258afe7ee9479528ae51c4ce90cbcc60db4c8ce16cd758" exitCode=0 Mar 09 04:04:03 crc kubenswrapper[4901]: I0309 04:04:03.184649 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550484-thrv5" event={"ID":"045dd9d2-24bb-4198-ad1f-1dfdfe81afb5","Type":"ContainerDied","Data":"8620404e3532a82760258afe7ee9479528ae51c4ce90cbcc60db4c8ce16cd758"} Mar 09 04:04:04 crc kubenswrapper[4901]: I0309 04:04:04.519773 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550484-thrv5" Mar 09 04:04:04 crc kubenswrapper[4901]: I0309 04:04:04.709477 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skc8z\" (UniqueName: \"kubernetes.io/projected/045dd9d2-24bb-4198-ad1f-1dfdfe81afb5-kube-api-access-skc8z\") pod \"045dd9d2-24bb-4198-ad1f-1dfdfe81afb5\" (UID: \"045dd9d2-24bb-4198-ad1f-1dfdfe81afb5\") " Mar 09 04:04:04 crc kubenswrapper[4901]: I0309 04:04:04.718974 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/045dd9d2-24bb-4198-ad1f-1dfdfe81afb5-kube-api-access-skc8z" (OuterVolumeSpecName: "kube-api-access-skc8z") pod "045dd9d2-24bb-4198-ad1f-1dfdfe81afb5" (UID: "045dd9d2-24bb-4198-ad1f-1dfdfe81afb5"). InnerVolumeSpecName "kube-api-access-skc8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:04:04 crc kubenswrapper[4901]: I0309 04:04:04.812248 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skc8z\" (UniqueName: \"kubernetes.io/projected/045dd9d2-24bb-4198-ad1f-1dfdfe81afb5-kube-api-access-skc8z\") on node \"crc\" DevicePath \"\"" Mar 09 04:04:05 crc kubenswrapper[4901]: I0309 04:04:05.211873 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550484-thrv5" event={"ID":"045dd9d2-24bb-4198-ad1f-1dfdfe81afb5","Type":"ContainerDied","Data":"75af71436862ccafb63f7c33427709f4097ff8e8c9612c57666980720c85aadb"} Mar 09 04:04:05 crc kubenswrapper[4901]: I0309 04:04:05.211971 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550484-thrv5" Mar 09 04:04:05 crc kubenswrapper[4901]: I0309 04:04:05.211975 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75af71436862ccafb63f7c33427709f4097ff8e8c9612c57666980720c85aadb" Mar 09 04:04:05 crc kubenswrapper[4901]: E0309 04:04:05.387976 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod045dd9d2_24bb_4198_ad1f_1dfdfe81afb5.slice/crio-75af71436862ccafb63f7c33427709f4097ff8e8c9612c57666980720c85aadb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod045dd9d2_24bb_4198_ad1f_1dfdfe81afb5.slice\": RecentStats: unable to find data in memory cache]" Mar 09 04:04:05 crc kubenswrapper[4901]: I0309 04:04:05.620548 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550478-sn8lz"] Mar 09 04:04:05 crc kubenswrapper[4901]: I0309 04:04:05.638356 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550478-sn8lz"] Mar 09 04:04:06 crc kubenswrapper[4901]: I0309 04:04:06.117396 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c89db68b-e834-4486-848e-303d651be8c0" path="/var/lib/kubelet/pods/c89db68b-e834-4486-848e-303d651be8c0/volumes" Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.852714 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-j4cvp"] Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.864762 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-j4cvp"] Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.942748 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jx9nz"] Mar 09 04:04:30 crc kubenswrapper[4901]: E0309 
04:04:30.943189 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="045dd9d2-24bb-4198-ad1f-1dfdfe81afb5" containerName="oc" Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.943257 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="045dd9d2-24bb-4198-ad1f-1dfdfe81afb5" containerName="oc" Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.943516 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="045dd9d2-24bb-4198-ad1f-1dfdfe81afb5" containerName="oc" Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.944255 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.952427 4901 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xv8zs" Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.952922 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.953184 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.954363 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jx9nz"] Mar 09 04:04:30 crc kubenswrapper[4901]: I0309 04:04:30.968698 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.117025 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6zjb\" (UniqueName: \"kubernetes.io/projected/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-kube-api-access-p6zjb\") pod \"crc-storage-crc-jx9nz\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.117090 
4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-crc-storage\") pod \"crc-storage-crc-jx9nz\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.117279 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-node-mnt\") pod \"crc-storage-crc-jx9nz\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.220307 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-node-mnt\") pod \"crc-storage-crc-jx9nz\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.220703 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-node-mnt\") pod \"crc-storage-crc-jx9nz\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.221149 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6zjb\" (UniqueName: \"kubernetes.io/projected/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-kube-api-access-p6zjb\") pod \"crc-storage-crc-jx9nz\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.221183 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-crc-storage\") pod \"crc-storage-crc-jx9nz\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.222640 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-crc-storage\") pod \"crc-storage-crc-jx9nz\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.247877 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6zjb\" (UniqueName: \"kubernetes.io/projected/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-kube-api-access-p6zjb\") pod \"crc-storage-crc-jx9nz\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.286637 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.535580 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jx9nz"] Mar 09 04:04:31 crc kubenswrapper[4901]: I0309 04:04:31.718844 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jx9nz" event={"ID":"5d14e961-4c4d-43f4-b71a-9a65c1cb5732","Type":"ContainerStarted","Data":"e4e64ed3c36087689d3056b39261097091f21495151e7748548325834395363f"} Mar 09 04:04:32 crc kubenswrapper[4901]: I0309 04:04:32.117353 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ede59c-7e38-437e-ba16-82adff7f9ef4" path="/var/lib/kubelet/pods/e0ede59c-7e38-437e-ba16-82adff7f9ef4/volumes" Mar 09 04:04:32 crc kubenswrapper[4901]: I0309 04:04:32.730923 4901 generic.go:334] "Generic (PLEG): container finished" podID="5d14e961-4c4d-43f4-b71a-9a65c1cb5732" containerID="2e66386be2b1f3b38ff34fb6f84d874f024b7ba9a4e63a7ca9df1d17e4294b4d" exitCode=0 Mar 09 04:04:32 crc kubenswrapper[4901]: I0309 04:04:32.731171 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jx9nz" event={"ID":"5d14e961-4c4d-43f4-b71a-9a65c1cb5732","Type":"ContainerDied","Data":"2e66386be2b1f3b38ff34fb6f84d874f024b7ba9a4e63a7ca9df1d17e4294b4d"} Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.105681 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.173298 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-crc-storage\") pod \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.173657 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-node-mnt\") pod \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.173682 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6zjb\" (UniqueName: \"kubernetes.io/projected/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-kube-api-access-p6zjb\") pod \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\" (UID: \"5d14e961-4c4d-43f4-b71a-9a65c1cb5732\") " Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.173783 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "5d14e961-4c4d-43f4-b71a-9a65c1cb5732" (UID: "5d14e961-4c4d-43f4-b71a-9a65c1cb5732"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.174384 4901 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.180197 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-kube-api-access-p6zjb" (OuterVolumeSpecName: "kube-api-access-p6zjb") pod "5d14e961-4c4d-43f4-b71a-9a65c1cb5732" (UID: "5d14e961-4c4d-43f4-b71a-9a65c1cb5732"). InnerVolumeSpecName "kube-api-access-p6zjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.193097 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "5d14e961-4c4d-43f4-b71a-9a65c1cb5732" (UID: "5d14e961-4c4d-43f4-b71a-9a65c1cb5732"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.275498 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6zjb\" (UniqueName: \"kubernetes.io/projected/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-kube-api-access-p6zjb\") on node \"crc\" DevicePath \"\"" Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.275546 4901 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5d14e961-4c4d-43f4-b71a-9a65c1cb5732-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.762742 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jx9nz" event={"ID":"5d14e961-4c4d-43f4-b71a-9a65c1cb5732","Type":"ContainerDied","Data":"e4e64ed3c36087689d3056b39261097091f21495151e7748548325834395363f"} Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.762796 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4e64ed3c36087689d3056b39261097091f21495151e7748548325834395363f" Mar 09 04:04:34 crc kubenswrapper[4901]: I0309 04:04:34.762856 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jx9nz" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.489581 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jx9nz"] Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.499625 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jx9nz"] Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.635176 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bfvhj"] Mar 09 04:04:36 crc kubenswrapper[4901]: E0309 04:04:36.635687 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d14e961-4c4d-43f4-b71a-9a65c1cb5732" containerName="storage" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.635716 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d14e961-4c4d-43f4-b71a-9a65c1cb5732" containerName="storage" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.635976 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d14e961-4c4d-43f4-b71a-9a65c1cb5732" containerName="storage" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.636810 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.639191 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.639382 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.640517 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.640852 4901 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xv8zs" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.643580 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bfvhj"] Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.721552 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7f2535f8-61b1-4707-bd09-5400a895d21f-node-mnt\") pod \"crc-storage-crc-bfvhj\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.721758 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7f2535f8-61b1-4707-bd09-5400a895d21f-crc-storage\") pod \"crc-storage-crc-bfvhj\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.721953 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xxmg\" (UniqueName: \"kubernetes.io/projected/7f2535f8-61b1-4707-bd09-5400a895d21f-kube-api-access-4xxmg\") pod \"crc-storage-crc-bfvhj\" (UID: 
\"7f2535f8-61b1-4707-bd09-5400a895d21f\") " pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.822876 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xxmg\" (UniqueName: \"kubernetes.io/projected/7f2535f8-61b1-4707-bd09-5400a895d21f-kube-api-access-4xxmg\") pod \"crc-storage-crc-bfvhj\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.822949 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7f2535f8-61b1-4707-bd09-5400a895d21f-node-mnt\") pod \"crc-storage-crc-bfvhj\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.822989 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7f2535f8-61b1-4707-bd09-5400a895d21f-crc-storage\") pod \"crc-storage-crc-bfvhj\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.823331 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7f2535f8-61b1-4707-bd09-5400a895d21f-node-mnt\") pod \"crc-storage-crc-bfvhj\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.824590 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7f2535f8-61b1-4707-bd09-5400a895d21f-crc-storage\") pod \"crc-storage-crc-bfvhj\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.856176 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xxmg\" (UniqueName: \"kubernetes.io/projected/7f2535f8-61b1-4707-bd09-5400a895d21f-kube-api-access-4xxmg\") pod \"crc-storage-crc-bfvhj\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:36 crc kubenswrapper[4901]: I0309 04:04:36.959687 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:37 crc kubenswrapper[4901]: I0309 04:04:37.473843 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bfvhj"] Mar 09 04:04:37 crc kubenswrapper[4901]: I0309 04:04:37.792847 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bfvhj" event={"ID":"7f2535f8-61b1-4707-bd09-5400a895d21f","Type":"ContainerStarted","Data":"271ca5676c4c0f68d9a33769eff1b36e07bb7dca643d24ce9e3dc02d0521b0ac"} Mar 09 04:04:38 crc kubenswrapper[4901]: I0309 04:04:38.119914 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d14e961-4c4d-43f4-b71a-9a65c1cb5732" path="/var/lib/kubelet/pods/5d14e961-4c4d-43f4-b71a-9a65c1cb5732/volumes" Mar 09 04:04:38 crc kubenswrapper[4901]: I0309 04:04:38.800708 4901 generic.go:334] "Generic (PLEG): container finished" podID="7f2535f8-61b1-4707-bd09-5400a895d21f" containerID="2f46eee5ac8c68418285a72673af94cd717eb319d050392bdd1be052cbf587d6" exitCode=0 Mar 09 04:04:38 crc kubenswrapper[4901]: I0309 04:04:38.800832 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bfvhj" event={"ID":"7f2535f8-61b1-4707-bd09-5400a895d21f","Type":"ContainerDied","Data":"2f46eee5ac8c68418285a72673af94cd717eb319d050392bdd1be052cbf587d6"} Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.587293 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.701291 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xxmg\" (UniqueName: \"kubernetes.io/projected/7f2535f8-61b1-4707-bd09-5400a895d21f-kube-api-access-4xxmg\") pod \"7f2535f8-61b1-4707-bd09-5400a895d21f\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.701479 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7f2535f8-61b1-4707-bd09-5400a895d21f-node-mnt\") pod \"7f2535f8-61b1-4707-bd09-5400a895d21f\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.701689 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f2535f8-61b1-4707-bd09-5400a895d21f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7f2535f8-61b1-4707-bd09-5400a895d21f" (UID: "7f2535f8-61b1-4707-bd09-5400a895d21f"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.701747 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7f2535f8-61b1-4707-bd09-5400a895d21f-crc-storage\") pod \"7f2535f8-61b1-4707-bd09-5400a895d21f\" (UID: \"7f2535f8-61b1-4707-bd09-5400a895d21f\") " Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.702265 4901 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7f2535f8-61b1-4707-bd09-5400a895d21f-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.710649 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2535f8-61b1-4707-bd09-5400a895d21f-kube-api-access-4xxmg" (OuterVolumeSpecName: "kube-api-access-4xxmg") pod "7f2535f8-61b1-4707-bd09-5400a895d21f" (UID: "7f2535f8-61b1-4707-bd09-5400a895d21f"). InnerVolumeSpecName "kube-api-access-4xxmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.741022 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f2535f8-61b1-4707-bd09-5400a895d21f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7f2535f8-61b1-4707-bd09-5400a895d21f" (UID: "7f2535f8-61b1-4707-bd09-5400a895d21f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.803781 4901 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7f2535f8-61b1-4707-bd09-5400a895d21f-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.803840 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xxmg\" (UniqueName: \"kubernetes.io/projected/7f2535f8-61b1-4707-bd09-5400a895d21f-kube-api-access-4xxmg\") on node \"crc\" DevicePath \"\"" Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.824884 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bfvhj" event={"ID":"7f2535f8-61b1-4707-bd09-5400a895d21f","Type":"ContainerDied","Data":"271ca5676c4c0f68d9a33769eff1b36e07bb7dca643d24ce9e3dc02d0521b0ac"} Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.824982 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="271ca5676c4c0f68d9a33769eff1b36e07bb7dca643d24ce9e3dc02d0521b0ac" Mar 09 04:04:40 crc kubenswrapper[4901]: I0309 04:04:40.824941 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bfvhj" Mar 09 04:04:41 crc kubenswrapper[4901]: I0309 04:04:41.501179 4901 scope.go:117] "RemoveContainer" containerID="03e4c2ed359b7095b87a721630de8607e1d76adc905ec0c387c97814877b006e" Mar 09 04:04:41 crc kubenswrapper[4901]: I0309 04:04:41.573392 4901 scope.go:117] "RemoveContainer" containerID="0a620c1720dbb11c2c0624f8c305c9ee0a274e8f00074f89bbf3562aba93deb8" Mar 09 04:05:58 crc kubenswrapper[4901]: I0309 04:05:58.846146 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b2dq2"] Mar 09 04:05:58 crc kubenswrapper[4901]: E0309 04:05:58.847491 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2535f8-61b1-4707-bd09-5400a895d21f" containerName="storage" Mar 09 04:05:58 crc kubenswrapper[4901]: I0309 04:05:58.847523 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2535f8-61b1-4707-bd09-5400a895d21f" containerName="storage" Mar 09 04:05:58 crc kubenswrapper[4901]: I0309 04:05:58.847946 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2535f8-61b1-4707-bd09-5400a895d21f" containerName="storage" Mar 09 04:05:58 crc kubenswrapper[4901]: I0309 04:05:58.849915 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:58 crc kubenswrapper[4901]: I0309 04:05:58.855451 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2dq2"] Mar 09 04:05:58 crc kubenswrapper[4901]: I0309 04:05:58.914402 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wq97\" (UniqueName: \"kubernetes.io/projected/af3e6664-3e02-4c1d-8304-8794bd131a0a-kube-api-access-7wq97\") pod \"redhat-marketplace-b2dq2\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:58 crc kubenswrapper[4901]: I0309 04:05:58.914511 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-utilities\") pod \"redhat-marketplace-b2dq2\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:58 crc kubenswrapper[4901]: I0309 04:05:58.914568 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-catalog-content\") pod \"redhat-marketplace-b2dq2\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:59 crc kubenswrapper[4901]: I0309 04:05:59.016650 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wq97\" (UniqueName: \"kubernetes.io/projected/af3e6664-3e02-4c1d-8304-8794bd131a0a-kube-api-access-7wq97\") pod \"redhat-marketplace-b2dq2\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:59 crc kubenswrapper[4901]: I0309 04:05:59.016826 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-utilities\") pod \"redhat-marketplace-b2dq2\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:59 crc kubenswrapper[4901]: I0309 04:05:59.016920 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-catalog-content\") pod \"redhat-marketplace-b2dq2\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:59 crc kubenswrapper[4901]: I0309 04:05:59.017408 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-utilities\") pod \"redhat-marketplace-b2dq2\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:59 crc kubenswrapper[4901]: I0309 04:05:59.017696 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-catalog-content\") pod \"redhat-marketplace-b2dq2\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:59 crc kubenswrapper[4901]: I0309 04:05:59.042031 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wq97\" (UniqueName: \"kubernetes.io/projected/af3e6664-3e02-4c1d-8304-8794bd131a0a-kube-api-access-7wq97\") pod \"redhat-marketplace-b2dq2\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:59 crc kubenswrapper[4901]: I0309 04:05:59.194681 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:05:59 crc kubenswrapper[4901]: I0309 04:05:59.517905 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2dq2"] Mar 09 04:05:59 crc kubenswrapper[4901]: I0309 04:05:59.587424 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2dq2" event={"ID":"af3e6664-3e02-4c1d-8304-8794bd131a0a","Type":"ContainerStarted","Data":"81a5f0e14f1e79ae05f6c8f996ab9c628ee3e4d28348d8aa94b48201efe16b5d"} Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.157605 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550486-sp2hq"] Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.159548 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550486-sp2hq" Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.165106 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.165424 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.166087 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.170303 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550486-sp2hq"] Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.235024 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn6gw\" (UniqueName: \"kubernetes.io/projected/1f69fd42-fdc0-4311-bc09-3a1307f04e40-kube-api-access-fn6gw\") pod \"auto-csr-approver-29550486-sp2hq\" (UID: 
\"1f69fd42-fdc0-4311-bc09-3a1307f04e40\") " pod="openshift-infra/auto-csr-approver-29550486-sp2hq" Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.336651 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn6gw\" (UniqueName: \"kubernetes.io/projected/1f69fd42-fdc0-4311-bc09-3a1307f04e40-kube-api-access-fn6gw\") pod \"auto-csr-approver-29550486-sp2hq\" (UID: \"1f69fd42-fdc0-4311-bc09-3a1307f04e40\") " pod="openshift-infra/auto-csr-approver-29550486-sp2hq" Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.376008 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn6gw\" (UniqueName: \"kubernetes.io/projected/1f69fd42-fdc0-4311-bc09-3a1307f04e40-kube-api-access-fn6gw\") pod \"auto-csr-approver-29550486-sp2hq\" (UID: \"1f69fd42-fdc0-4311-bc09-3a1307f04e40\") " pod="openshift-infra/auto-csr-approver-29550486-sp2hq" Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.491968 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550486-sp2hq" Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.596878 4901 generic.go:334] "Generic (PLEG): container finished" podID="af3e6664-3e02-4c1d-8304-8794bd131a0a" containerID="024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715" exitCode=0 Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.596923 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2dq2" event={"ID":"af3e6664-3e02-4c1d-8304-8794bd131a0a","Type":"ContainerDied","Data":"024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715"} Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.599558 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 04:06:00 crc kubenswrapper[4901]: I0309 04:06:00.788729 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550486-sp2hq"] Mar 09 04:06:01 crc kubenswrapper[4901]: I0309 04:06:01.609063 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550486-sp2hq" event={"ID":"1f69fd42-fdc0-4311-bc09-3a1307f04e40","Type":"ContainerStarted","Data":"fff60191600db1b59a803419495b8354717dda4aa8263740559628212714f222"} Mar 09 04:06:01 crc kubenswrapper[4901]: I0309 04:06:01.613645 4901 generic.go:334] "Generic (PLEG): container finished" podID="af3e6664-3e02-4c1d-8304-8794bd131a0a" containerID="c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be" exitCode=0 Mar 09 04:06:01 crc kubenswrapper[4901]: I0309 04:06:01.613679 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2dq2" event={"ID":"af3e6664-3e02-4c1d-8304-8794bd131a0a","Type":"ContainerDied","Data":"c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be"} Mar 09 04:06:02 crc kubenswrapper[4901]: I0309 04:06:02.623776 4901 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-b2dq2" event={"ID":"af3e6664-3e02-4c1d-8304-8794bd131a0a","Type":"ContainerStarted","Data":"d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f"} Mar 09 04:06:02 crc kubenswrapper[4901]: I0309 04:06:02.626139 4901 generic.go:334] "Generic (PLEG): container finished" podID="1f69fd42-fdc0-4311-bc09-3a1307f04e40" containerID="0a578304b4f8dd429dc0322a1a5701a989c64ed6aed0f8a27d5886dfe44f993d" exitCode=0 Mar 09 04:06:02 crc kubenswrapper[4901]: I0309 04:06:02.626191 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550486-sp2hq" event={"ID":"1f69fd42-fdc0-4311-bc09-3a1307f04e40","Type":"ContainerDied","Data":"0a578304b4f8dd429dc0322a1a5701a989c64ed6aed0f8a27d5886dfe44f993d"} Mar 09 04:06:02 crc kubenswrapper[4901]: I0309 04:06:02.652198 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b2dq2" podStartSLOduration=3.203996012 podStartE2EDuration="4.652174216s" podCreationTimestamp="2026-03-09 04:05:58 +0000 UTC" firstStartedPulling="2026-03-09 04:06:00.59910949 +0000 UTC m=+5085.188773242" lastFinishedPulling="2026-03-09 04:06:02.047287684 +0000 UTC m=+5086.636951446" observedRunningTime="2026-03-09 04:06:02.64147298 +0000 UTC m=+5087.231136722" watchObservedRunningTime="2026-03-09 04:06:02.652174216 +0000 UTC m=+5087.241837958" Mar 09 04:06:03 crc kubenswrapper[4901]: I0309 04:06:03.988038 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550486-sp2hq" Mar 09 04:06:04 crc kubenswrapper[4901]: I0309 04:06:04.090394 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn6gw\" (UniqueName: \"kubernetes.io/projected/1f69fd42-fdc0-4311-bc09-3a1307f04e40-kube-api-access-fn6gw\") pod \"1f69fd42-fdc0-4311-bc09-3a1307f04e40\" (UID: \"1f69fd42-fdc0-4311-bc09-3a1307f04e40\") " Mar 09 04:06:04 crc kubenswrapper[4901]: I0309 04:06:04.096717 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f69fd42-fdc0-4311-bc09-3a1307f04e40-kube-api-access-fn6gw" (OuterVolumeSpecName: "kube-api-access-fn6gw") pod "1f69fd42-fdc0-4311-bc09-3a1307f04e40" (UID: "1f69fd42-fdc0-4311-bc09-3a1307f04e40"). InnerVolumeSpecName "kube-api-access-fn6gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:06:04 crc kubenswrapper[4901]: I0309 04:06:04.192327 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn6gw\" (UniqueName: \"kubernetes.io/projected/1f69fd42-fdc0-4311-bc09-3a1307f04e40-kube-api-access-fn6gw\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:04 crc kubenswrapper[4901]: I0309 04:06:04.646763 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550486-sp2hq" event={"ID":"1f69fd42-fdc0-4311-bc09-3a1307f04e40","Type":"ContainerDied","Data":"fff60191600db1b59a803419495b8354717dda4aa8263740559628212714f222"} Mar 09 04:06:04 crc kubenswrapper[4901]: I0309 04:06:04.646808 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fff60191600db1b59a803419495b8354717dda4aa8263740559628212714f222" Mar 09 04:06:04 crc kubenswrapper[4901]: I0309 04:06:04.646862 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550486-sp2hq" Mar 09 04:06:05 crc kubenswrapper[4901]: I0309 04:06:05.070940 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550480-4trsp"] Mar 09 04:06:05 crc kubenswrapper[4901]: I0309 04:06:05.075468 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550480-4trsp"] Mar 09 04:06:06 crc kubenswrapper[4901]: I0309 04:06:06.124279 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="016d7314-606e-4045-bdbd-1554776219aa" path="/var/lib/kubelet/pods/016d7314-606e-4045-bdbd-1554776219aa/volumes" Mar 09 04:06:09 crc kubenswrapper[4901]: I0309 04:06:09.195277 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:06:09 crc kubenswrapper[4901]: I0309 04:06:09.196505 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:06:09 crc kubenswrapper[4901]: I0309 04:06:09.277643 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:06:09 crc kubenswrapper[4901]: I0309 04:06:09.767497 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:06:09 crc kubenswrapper[4901]: I0309 04:06:09.828079 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2dq2"] Mar 09 04:06:11 crc kubenswrapper[4901]: I0309 04:06:11.710141 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b2dq2" podUID="af3e6664-3e02-4c1d-8304-8794bd131a0a" containerName="registry-server" containerID="cri-o://d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f" gracePeriod=2 Mar 09 04:06:12 crc 
kubenswrapper[4901]: I0309 04:06:12.224638 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.337060 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-utilities\") pod \"af3e6664-3e02-4c1d-8304-8794bd131a0a\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.337181 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wq97\" (UniqueName: \"kubernetes.io/projected/af3e6664-3e02-4c1d-8304-8794bd131a0a-kube-api-access-7wq97\") pod \"af3e6664-3e02-4c1d-8304-8794bd131a0a\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.337297 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-catalog-content\") pod \"af3e6664-3e02-4c1d-8304-8794bd131a0a\" (UID: \"af3e6664-3e02-4c1d-8304-8794bd131a0a\") " Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.339923 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-utilities" (OuterVolumeSpecName: "utilities") pod "af3e6664-3e02-4c1d-8304-8794bd131a0a" (UID: "af3e6664-3e02-4c1d-8304-8794bd131a0a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.347754 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3e6664-3e02-4c1d-8304-8794bd131a0a-kube-api-access-7wq97" (OuterVolumeSpecName: "kube-api-access-7wq97") pod "af3e6664-3e02-4c1d-8304-8794bd131a0a" (UID: "af3e6664-3e02-4c1d-8304-8794bd131a0a"). InnerVolumeSpecName "kube-api-access-7wq97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.381665 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af3e6664-3e02-4c1d-8304-8794bd131a0a" (UID: "af3e6664-3e02-4c1d-8304-8794bd131a0a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.439303 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.439371 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wq97\" (UniqueName: \"kubernetes.io/projected/af3e6664-3e02-4c1d-8304-8794bd131a0a-kube-api-access-7wq97\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.439395 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af3e6664-3e02-4c1d-8304-8794bd131a0a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.726364 4901 generic.go:334] "Generic (PLEG): container finished" podID="af3e6664-3e02-4c1d-8304-8794bd131a0a" 
containerID="d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f" exitCode=0 Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.726431 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2dq2" event={"ID":"af3e6664-3e02-4c1d-8304-8794bd131a0a","Type":"ContainerDied","Data":"d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f"} Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.726483 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2dq2" event={"ID":"af3e6664-3e02-4c1d-8304-8794bd131a0a","Type":"ContainerDied","Data":"81a5f0e14f1e79ae05f6c8f996ab9c628ee3e4d28348d8aa94b48201efe16b5d"} Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.726515 4901 scope.go:117] "RemoveContainer" containerID="d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f" Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.726567 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b2dq2" Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.801731 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2dq2"] Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.803778 4901 scope.go:117] "RemoveContainer" containerID="c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be" Mar 09 04:06:12 crc kubenswrapper[4901]: I0309 04:06:12.812455 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2dq2"] Mar 09 04:06:13 crc kubenswrapper[4901]: I0309 04:06:13.092157 4901 scope.go:117] "RemoveContainer" containerID="024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715" Mar 09 04:06:13 crc kubenswrapper[4901]: I0309 04:06:13.120333 4901 scope.go:117] "RemoveContainer" containerID="d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f" Mar 09 04:06:13 crc kubenswrapper[4901]: E0309 04:06:13.120837 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f\": container with ID starting with d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f not found: ID does not exist" containerID="d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f" Mar 09 04:06:13 crc kubenswrapper[4901]: I0309 04:06:13.120915 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f"} err="failed to get container status \"d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f\": rpc error: code = NotFound desc = could not find container \"d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f\": container with ID starting with d2ec2c3daec9509803ef69ec3f3b4fcaa276da6c87371fa097f66a8d040f995f not found: 
ID does not exist" Mar 09 04:06:13 crc kubenswrapper[4901]: I0309 04:06:13.120959 4901 scope.go:117] "RemoveContainer" containerID="c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be" Mar 09 04:06:13 crc kubenswrapper[4901]: E0309 04:06:13.121407 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be\": container with ID starting with c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be not found: ID does not exist" containerID="c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be" Mar 09 04:06:13 crc kubenswrapper[4901]: I0309 04:06:13.121437 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be"} err="failed to get container status \"c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be\": rpc error: code = NotFound desc = could not find container \"c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be\": container with ID starting with c078d81fe7018e3fff4b95a7f287c279aa8fc1c0476e704009e123e88deb60be not found: ID does not exist" Mar 09 04:06:13 crc kubenswrapper[4901]: I0309 04:06:13.121457 4901 scope.go:117] "RemoveContainer" containerID="024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715" Mar 09 04:06:13 crc kubenswrapper[4901]: E0309 04:06:13.121817 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715\": container with ID starting with 024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715 not found: ID does not exist" containerID="024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715" Mar 09 04:06:13 crc kubenswrapper[4901]: I0309 04:06:13.121904 4901 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715"} err="failed to get container status \"024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715\": rpc error: code = NotFound desc = could not find container \"024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715\": container with ID starting with 024d16a4d4d8bff5b416d3f2fc171525deef6f47b442b01206639ca98a214715 not found: ID does not exist" Mar 09 04:06:14 crc kubenswrapper[4901]: I0309 04:06:14.123799 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3e6664-3e02-4c1d-8304-8794bd131a0a" path="/var/lib/kubelet/pods/af3e6664-3e02-4c1d-8304-8794bd131a0a/volumes" Mar 09 04:06:30 crc kubenswrapper[4901]: I0309 04:06:30.863558 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:06:30 crc kubenswrapper[4901]: I0309 04:06:30.864105 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.222303 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vsmcd"] Mar 09 04:06:34 crc kubenswrapper[4901]: E0309 04:06:34.223314 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3e6664-3e02-4c1d-8304-8794bd131a0a" containerName="registry-server" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.223335 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="af3e6664-3e02-4c1d-8304-8794bd131a0a" containerName="registry-server" Mar 09 04:06:34 crc kubenswrapper[4901]: E0309 04:06:34.223389 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3e6664-3e02-4c1d-8304-8794bd131a0a" containerName="extract-utilities" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.223405 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3e6664-3e02-4c1d-8304-8794bd131a0a" containerName="extract-utilities" Mar 09 04:06:34 crc kubenswrapper[4901]: E0309 04:06:34.223428 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f69fd42-fdc0-4311-bc09-3a1307f04e40" containerName="oc" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.223440 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f69fd42-fdc0-4311-bc09-3a1307f04e40" containerName="oc" Mar 09 04:06:34 crc kubenswrapper[4901]: E0309 04:06:34.223458 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3e6664-3e02-4c1d-8304-8794bd131a0a" containerName="extract-content" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.223469 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3e6664-3e02-4c1d-8304-8794bd131a0a" containerName="extract-content" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.223702 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3e6664-3e02-4c1d-8304-8794bd131a0a" containerName="registry-server" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.223737 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f69fd42-fdc0-4311-bc09-3a1307f04e40" containerName="oc" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.225427 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.238862 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vsmcd"] Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.378956 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv59p\" (UniqueName: \"kubernetes.io/projected/0527d7c3-c662-41df-bb0c-7bc56b6da775-kube-api-access-bv59p\") pod \"community-operators-vsmcd\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.379013 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-utilities\") pod \"community-operators-vsmcd\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.379118 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-catalog-content\") pod \"community-operators-vsmcd\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.480067 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv59p\" (UniqueName: \"kubernetes.io/projected/0527d7c3-c662-41df-bb0c-7bc56b6da775-kube-api-access-bv59p\") pod \"community-operators-vsmcd\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.480149 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-utilities\") pod \"community-operators-vsmcd\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.481042 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-utilities\") pod \"community-operators-vsmcd\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.481294 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-catalog-content\") pod \"community-operators-vsmcd\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.481775 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-catalog-content\") pod \"community-operators-vsmcd\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.542054 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv59p\" (UniqueName: \"kubernetes.io/projected/0527d7c3-c662-41df-bb0c-7bc56b6da775-kube-api-access-bv59p\") pod \"community-operators-vsmcd\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:34 crc kubenswrapper[4901]: I0309 04:06:34.553081 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:35 crc kubenswrapper[4901]: I0309 04:06:35.096280 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vsmcd"] Mar 09 04:06:35 crc kubenswrapper[4901]: W0309 04:06:35.100168 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0527d7c3_c662_41df_bb0c_7bc56b6da775.slice/crio-160df8621c49b3578480995fb8761e1074a9eb9c391bca89210e680ff09b9a4e WatchSource:0}: Error finding container 160df8621c49b3578480995fb8761e1074a9eb9c391bca89210e680ff09b9a4e: Status 404 returned error can't find the container with id 160df8621c49b3578480995fb8761e1074a9eb9c391bca89210e680ff09b9a4e Mar 09 04:06:35 crc kubenswrapper[4901]: I0309 04:06:35.958456 4901 generic.go:334] "Generic (PLEG): container finished" podID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerID="e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3" exitCode=0 Mar 09 04:06:35 crc kubenswrapper[4901]: I0309 04:06:35.958541 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsmcd" event={"ID":"0527d7c3-c662-41df-bb0c-7bc56b6da775","Type":"ContainerDied","Data":"e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3"} Mar 09 04:06:35 crc kubenswrapper[4901]: I0309 04:06:35.958859 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsmcd" event={"ID":"0527d7c3-c662-41df-bb0c-7bc56b6da775","Type":"ContainerStarted","Data":"160df8621c49b3578480995fb8761e1074a9eb9c391bca89210e680ff09b9a4e"} Mar 09 04:06:37 crc kubenswrapper[4901]: I0309 04:06:37.978531 4901 generic.go:334] "Generic (PLEG): container finished" podID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerID="6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699" exitCode=0 Mar 09 04:06:37 crc kubenswrapper[4901]: I0309 
04:06:37.978647 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsmcd" event={"ID":"0527d7c3-c662-41df-bb0c-7bc56b6da775","Type":"ContainerDied","Data":"6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699"} Mar 09 04:06:38 crc kubenswrapper[4901]: I0309 04:06:38.991629 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsmcd" event={"ID":"0527d7c3-c662-41df-bb0c-7bc56b6da775","Type":"ContainerStarted","Data":"d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b"} Mar 09 04:06:39 crc kubenswrapper[4901]: I0309 04:06:39.025750 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vsmcd" podStartSLOduration=2.597319367 podStartE2EDuration="5.025725556s" podCreationTimestamp="2026-03-09 04:06:34 +0000 UTC" firstStartedPulling="2026-03-09 04:06:35.960935513 +0000 UTC m=+5120.550599275" lastFinishedPulling="2026-03-09 04:06:38.389341712 +0000 UTC m=+5122.979005464" observedRunningTime="2026-03-09 04:06:39.019746028 +0000 UTC m=+5123.609409770" watchObservedRunningTime="2026-03-09 04:06:39.025725556 +0000 UTC m=+5123.615389298" Mar 09 04:06:41 crc kubenswrapper[4901]: I0309 04:06:41.711377 4901 scope.go:117] "RemoveContainer" containerID="ed0b53477ba76499d938ed98d2b45817a433ae1e3efcf11a6aecd464c8f5a3e1" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.861357 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c44667757-7bvgc"] Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.863714 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-7bvgc" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.866925 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.867343 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.868107 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-sg4nn"] Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.868478 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.869291 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.872643 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jmkmg" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.876184 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.891271 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-sg4nn"] Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.934601 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c44667757-7bvgc"] Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.948095 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xttzt\" (UniqueName: \"kubernetes.io/projected/efc43cb1-a1e5-4e46-98ed-071fd960a438-kube-api-access-xttzt\") pod \"dnsmasq-dns-55c76fd6b7-sg4nn\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:43 crc 
kubenswrapper[4901]: I0309 04:06:43.948139 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pcmt\" (UniqueName: \"kubernetes.io/projected/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-kube-api-access-9pcmt\") pod \"dnsmasq-dns-c44667757-7bvgc\" (UID: \"1f256e2a-9089-41fe-a06b-a44d5f53ac1c\") " pod="openstack/dnsmasq-dns-c44667757-7bvgc" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.948180 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-config\") pod \"dnsmasq-dns-55c76fd6b7-sg4nn\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.948305 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-config\") pod \"dnsmasq-dns-c44667757-7bvgc\" (UID: \"1f256e2a-9089-41fe-a06b-a44d5f53ac1c\") " pod="openstack/dnsmasq-dns-c44667757-7bvgc" Mar 09 04:06:43 crc kubenswrapper[4901]: I0309 04:06:43.948485 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-sg4nn\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.049872 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xttzt\" (UniqueName: \"kubernetes.io/projected/efc43cb1-a1e5-4e46-98ed-071fd960a438-kube-api-access-xttzt\") pod \"dnsmasq-dns-55c76fd6b7-sg4nn\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:44 crc 
kubenswrapper[4901]: I0309 04:06:44.049925 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pcmt\" (UniqueName: \"kubernetes.io/projected/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-kube-api-access-9pcmt\") pod \"dnsmasq-dns-c44667757-7bvgc\" (UID: \"1f256e2a-9089-41fe-a06b-a44d5f53ac1c\") " pod="openstack/dnsmasq-dns-c44667757-7bvgc" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.049957 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-config\") pod \"dnsmasq-dns-55c76fd6b7-sg4nn\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.049980 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-config\") pod \"dnsmasq-dns-c44667757-7bvgc\" (UID: \"1f256e2a-9089-41fe-a06b-a44d5f53ac1c\") " pod="openstack/dnsmasq-dns-c44667757-7bvgc" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.050028 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-sg4nn\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.050839 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-sg4nn\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.051654 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-config\") pod \"dnsmasq-dns-55c76fd6b7-sg4nn\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.052181 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-config\") pod \"dnsmasq-dns-c44667757-7bvgc\" (UID: \"1f256e2a-9089-41fe-a06b-a44d5f53ac1c\") " pod="openstack/dnsmasq-dns-c44667757-7bvgc" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.070359 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pcmt\" (UniqueName: \"kubernetes.io/projected/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-kube-api-access-9pcmt\") pod \"dnsmasq-dns-c44667757-7bvgc\" (UID: \"1f256e2a-9089-41fe-a06b-a44d5f53ac1c\") " pod="openstack/dnsmasq-dns-c44667757-7bvgc" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.074134 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xttzt\" (UniqueName: \"kubernetes.io/projected/efc43cb1-a1e5-4e46-98ed-071fd960a438-kube-api-access-xttzt\") pod \"dnsmasq-dns-55c76fd6b7-sg4nn\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.181453 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-7bvgc" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.191676 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.241017 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-sg4nn"] Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.271307 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-mtds9"] Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.273804 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.282888 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-mtds9"] Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.353685 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t48qf\" (UniqueName: \"kubernetes.io/projected/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-kube-api-access-t48qf\") pod \"dnsmasq-dns-5fb77f9685-mtds9\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.354004 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-config\") pod \"dnsmasq-dns-5fb77f9685-mtds9\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.354032 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-mtds9\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 
04:06:44.457330 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t48qf\" (UniqueName: \"kubernetes.io/projected/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-kube-api-access-t48qf\") pod \"dnsmasq-dns-5fb77f9685-mtds9\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.457375 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-config\") pod \"dnsmasq-dns-5fb77f9685-mtds9\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.457403 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-mtds9\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.458312 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-mtds9\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.458800 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-config\") pod \"dnsmasq-dns-5fb77f9685-mtds9\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.479478 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t48qf\" 
(UniqueName: \"kubernetes.io/projected/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-kube-api-access-t48qf\") pod \"dnsmasq-dns-5fb77f9685-mtds9\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.553623 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.553679 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.556177 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-7bvgc"] Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.601288 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-8kf9k"] Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.602797 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.608715 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-8kf9k"] Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.609383 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.611732 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.663143 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-config\") pod \"dnsmasq-dns-ff89b6977-8kf9k\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.663241 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzp7n\" (UniqueName: \"kubernetes.io/projected/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-kube-api-access-zzp7n\") pod \"dnsmasq-dns-ff89b6977-8kf9k\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.663310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-dns-svc\") pod \"dnsmasq-dns-ff89b6977-8kf9k\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.747271 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-7bvgc"] Mar 09 04:06:44 crc kubenswrapper[4901]: W0309 04:06:44.752958 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f256e2a_9089_41fe_a06b_a44d5f53ac1c.slice/crio-54d7c54b242398aa60fa17dbd4d0c855cc542053362b4b2b4997557735a3c3ce WatchSource:0}: Error finding container 
54d7c54b242398aa60fa17dbd4d0c855cc542053362b4b2b4997557735a3c3ce: Status 404 returned error can't find the container with id 54d7c54b242398aa60fa17dbd4d0c855cc542053362b4b2b4997557735a3c3ce Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.776527 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzp7n\" (UniqueName: \"kubernetes.io/projected/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-kube-api-access-zzp7n\") pod \"dnsmasq-dns-ff89b6977-8kf9k\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.776689 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-dns-svc\") pod \"dnsmasq-dns-ff89b6977-8kf9k\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.776811 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-config\") pod \"dnsmasq-dns-ff89b6977-8kf9k\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.777600 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-config\") pod \"dnsmasq-dns-ff89b6977-8kf9k\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.778437 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-dns-svc\") pod \"dnsmasq-dns-ff89b6977-8kf9k\" (UID: 
\"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.814289 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzp7n\" (UniqueName: \"kubernetes.io/projected/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-kube-api-access-zzp7n\") pod \"dnsmasq-dns-ff89b6977-8kf9k\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.827982 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-sg4nn"] Mar 09 04:06:44 crc kubenswrapper[4901]: I0309 04:06:44.936502 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.045712 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-7bvgc" event={"ID":"1f256e2a-9089-41fe-a06b-a44d5f53ac1c","Type":"ContainerStarted","Data":"54d7c54b242398aa60fa17dbd4d0c855cc542053362b4b2b4997557735a3c3ce"} Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.050033 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" event={"ID":"efc43cb1-a1e5-4e46-98ed-071fd960a438","Type":"ContainerStarted","Data":"66f326bbbfc5f0797d1e5974795313d7ffca44cad786002322f9758403fa07d4"} Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.111660 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.169036 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vsmcd"] Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.256201 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-mtds9"] Mar 09 
04:06:45 crc kubenswrapper[4901]: W0309 04:06:45.281317 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc57ee1f2_a2a5_45dd_9f0b_9eab37ae9b1f.slice/crio-6da675dd11cbf9058b6de1fe5a3f9df945512678c1afc5c67904dadef5c2f6b0 WatchSource:0}: Error finding container 6da675dd11cbf9058b6de1fe5a3f9df945512678c1afc5c67904dadef5c2f6b0: Status 404 returned error can't find the container with id 6da675dd11cbf9058b6de1fe5a3f9df945512678c1afc5c67904dadef5c2f6b0 Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.417250 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.418795 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.423456 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-8kf9k"] Mar 09 04:06:45 crc kubenswrapper[4901]: W0309 04:06:45.426291 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod274fc5c2_4ab4_47b9_b147_d0a47e22b70e.slice/crio-c3129d9e8558c167184340d0f1380f89af9d0f53d86f11c598847d18dfa32e35 WatchSource:0}: Error finding container c3129d9e8558c167184340d0f1380f89af9d0f53d86f11c598847d18dfa32e35: Status 404 returned error can't find the container with id c3129d9e8558c167184340d0f1380f89af9d0f53d86f11c598847d18dfa32e35 Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.428847 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.429179 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cptq8" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.429057 4901 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.429099 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.429147 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.429650 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.429834 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.450931 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489682 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489732 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489750 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489772 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2b9c342-0b0e-486b-bc0c-2abd0319879d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489794 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489815 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2b9c342-0b0e-486b-bc0c-2abd0319879d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489839 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489860 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489882 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwxsw\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-kube-api-access-dwxsw\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489901 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.489931 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590642 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590691 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590725 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590742 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590764 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2b9c342-0b0e-486b-bc0c-2abd0319879d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590783 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590802 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/a2b9c342-0b0e-486b-bc0c-2abd0319879d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590827 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590848 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590869 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwxsw\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-kube-api-access-dwxsw\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.590886 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.591596 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.592419 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.592525 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.592719 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.592806 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.593376 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.593416 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6bf618cbd06e1420d56660ed6d2a8f0cd7254ca02606d9ff3d5918d03ec1f102/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.598135 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2b9c342-0b0e-486b-bc0c-2abd0319879d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.598299 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.598678 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.611580 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2b9c342-0b0e-486b-bc0c-2abd0319879d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.612601 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwxsw\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-kube-api-access-dwxsw\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.620774 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.739461 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.779023 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.781552 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.785429 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.785705 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.785835 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.785979 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-w67xx" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.786912 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.787313 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.788329 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.791446 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.895970 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-config-data\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.896056 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.896156 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.896316 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.896454 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.896562 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.896611 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.896656 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.896675 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw9jk\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-kube-api-access-qw9jk\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.896736 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.896755 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5f37a386-b7fa-4103-b738-f202db5aac24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.999403 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.999448 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.999474 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:45 crc kubenswrapper[4901]: I0309 04:06:45.999490 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw9jk\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-kube-api-access-qw9jk\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:45.999518 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:45.999545 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5f37a386-b7fa-4103-b738-f202db5aac24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") pod \"rabbitmq-server-0\" (UID: 
\"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:45.999571 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-config-data\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:45.999622 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.000492 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.001680 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.001737 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.001783 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.002291 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.002367 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.002680 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-config-data\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.003021 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.003660 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.003699 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5f37a386-b7fa-4103-b738-f202db5aac24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5412389fc365b63b1ccd7e3b2d0bef28e36381a956388f06746ecca5231e1554/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.006809 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.007360 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.008085 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.008396 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " 
pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.020406 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw9jk\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-kube-api-access-qw9jk\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.035943 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5f37a386-b7fa-4103-b738-f202db5aac24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") pod \"rabbitmq-server-0\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.061517 4901 generic.go:334] "Generic (PLEG): container finished" podID="274fc5c2-4ab4-47b9-b147-d0a47e22b70e" containerID="5beda19574c2f6440ef021be2906dcd6db84cab953e4edf99f91a06eef67d595" exitCode=0 Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.061565 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" event={"ID":"274fc5c2-4ab4-47b9-b147-d0a47e22b70e","Type":"ContainerDied","Data":"5beda19574c2f6440ef021be2906dcd6db84cab953e4edf99f91a06eef67d595"} Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.061615 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" event={"ID":"274fc5c2-4ab4-47b9-b147-d0a47e22b70e","Type":"ContainerStarted","Data":"c3129d9e8558c167184340d0f1380f89af9d0f53d86f11c598847d18dfa32e35"} Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.063064 4901 generic.go:334] "Generic (PLEG): container finished" podID="1f256e2a-9089-41fe-a06b-a44d5f53ac1c" containerID="0abd78b78fecbdc942dbaa2a4d76bf4e5892e555d35373367091b0065bba5435" exitCode=0 Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 
04:06:46.063112 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-7bvgc" event={"ID":"1f256e2a-9089-41fe-a06b-a44d5f53ac1c","Type":"ContainerDied","Data":"0abd78b78fecbdc942dbaa2a4d76bf4e5892e555d35373367091b0065bba5435"} Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.065716 4901 generic.go:334] "Generic (PLEG): container finished" podID="c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" containerID="6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0" exitCode=0 Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.065774 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" event={"ID":"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f","Type":"ContainerDied","Data":"6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0"} Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.065796 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" event={"ID":"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f","Type":"ContainerStarted","Data":"6da675dd11cbf9058b6de1fe5a3f9df945512678c1afc5c67904dadef5c2f6b0"} Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.068841 4901 generic.go:334] "Generic (PLEG): container finished" podID="efc43cb1-a1e5-4e46-98ed-071fd960a438" containerID="1d681f8fddea43684e9e4509a52b1fb7a90e8fb6f8d23b8ac86fc1f2c46983ca" exitCode=0 Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.069314 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" event={"ID":"efc43cb1-a1e5-4e46-98ed-071fd960a438","Type":"ContainerDied","Data":"1d681f8fddea43684e9e4509a52b1fb7a90e8fb6f8d23b8ac86fc1f2c46983ca"} Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.106085 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.205811 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 04:06:46 crc kubenswrapper[4901]: W0309 04:06:46.221370 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2b9c342_0b0e_486b_bc0c_2abd0319879d.slice/crio-5bb5463bbbdca605f84adaabebc5a703a00a739549f4156758df9372b2fce0e5 WatchSource:0}: Error finding container 5bb5463bbbdca605f84adaabebc5a703a00a739549f4156758df9372b2fce0e5: Status 404 returned error can't find the container with id 5bb5463bbbdca605f84adaabebc5a703a00a739549f4156758df9372b2fce0e5 Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.317618 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-7bvgc" Mar 09 04:06:46 crc kubenswrapper[4901]: E0309 04:06:46.407896 4901 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 09 04:06:46 crc kubenswrapper[4901]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 09 04:06:46 crc kubenswrapper[4901]: > podSandboxID="6da675dd11cbf9058b6de1fe5a3f9df945512678c1afc5c67904dadef5c2f6b0" Mar 09 04:06:46 crc kubenswrapper[4901]: E0309 04:06:46.408059 4901 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 04:06:46 crc kubenswrapper[4901]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) 
--port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb6hc5h68h68h594h659hdbh679h65ch5f6hdch6h5b9h8fh55hfhf8h57fhc7h56ch687h669h559h678h5dhc7hf7h697h5d6h9ch669h54fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t48qf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5fb77f9685-mtds9_openstack(c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 09 04:06:46 crc kubenswrapper[4901]: > logger="UnhandledError" Mar 09 04:06:46 crc kubenswrapper[4901]: E0309 04:06:46.410360 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" podUID="c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.414015 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-config\") pod \"1f256e2a-9089-41fe-a06b-a44d5f53ac1c\" (UID: \"1f256e2a-9089-41fe-a06b-a44d5f53ac1c\") " 
Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.414135 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pcmt\" (UniqueName: \"kubernetes.io/projected/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-kube-api-access-9pcmt\") pod \"1f256e2a-9089-41fe-a06b-a44d5f53ac1c\" (UID: \"1f256e2a-9089-41fe-a06b-a44d5f53ac1c\") " Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.418742 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-kube-api-access-9pcmt" (OuterVolumeSpecName: "kube-api-access-9pcmt") pod "1f256e2a-9089-41fe-a06b-a44d5f53ac1c" (UID: "1f256e2a-9089-41fe-a06b-a44d5f53ac1c"). InnerVolumeSpecName "kube-api-access-9pcmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.431839 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-config" (OuterVolumeSpecName: "config") pod "1f256e2a-9089-41fe-a06b-a44d5f53ac1c" (UID: "1f256e2a-9089-41fe-a06b-a44d5f53ac1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.461044 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.515892 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-dns-svc\") pod \"efc43cb1-a1e5-4e46-98ed-071fd960a438\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.515963 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-config\") pod \"efc43cb1-a1e5-4e46-98ed-071fd960a438\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.515997 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xttzt\" (UniqueName: \"kubernetes.io/projected/efc43cb1-a1e5-4e46-98ed-071fd960a438-kube-api-access-xttzt\") pod \"efc43cb1-a1e5-4e46-98ed-071fd960a438\" (UID: \"efc43cb1-a1e5-4e46-98ed-071fd960a438\") " Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.516502 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-config\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.516520 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pcmt\" (UniqueName: \"kubernetes.io/projected/1f256e2a-9089-41fe-a06b-a44d5f53ac1c-kube-api-access-9pcmt\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.519782 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc43cb1-a1e5-4e46-98ed-071fd960a438-kube-api-access-xttzt" (OuterVolumeSpecName: "kube-api-access-xttzt") pod "efc43cb1-a1e5-4e46-98ed-071fd960a438" (UID: 
"efc43cb1-a1e5-4e46-98ed-071fd960a438"). InnerVolumeSpecName "kube-api-access-xttzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.535816 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-config" (OuterVolumeSpecName: "config") pod "efc43cb1-a1e5-4e46-98ed-071fd960a438" (UID: "efc43cb1-a1e5-4e46-98ed-071fd960a438"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.537758 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 09 04:06:46 crc kubenswrapper[4901]: E0309 04:06:46.538116 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc43cb1-a1e5-4e46-98ed-071fd960a438" containerName="init" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.538130 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc43cb1-a1e5-4e46-98ed-071fd960a438" containerName="init" Mar 09 04:06:46 crc kubenswrapper[4901]: E0309 04:06:46.538147 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f256e2a-9089-41fe-a06b-a44d5f53ac1c" containerName="init" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.538154 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f256e2a-9089-41fe-a06b-a44d5f53ac1c" containerName="init" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.538298 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f256e2a-9089-41fe-a06b-a44d5f53ac1c" containerName="init" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.538308 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc43cb1-a1e5-4e46-98ed-071fd960a438" containerName="init" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.539110 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.545550 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.545584 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-j794b" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.545638 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.545703 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.548121 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.551882 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "efc43cb1-a1e5-4e46-98ed-071fd960a438" (UID: "efc43cb1-a1e5-4e46-98ed-071fd960a438"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.555053 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.617564 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9b1ce996-6c66-4cd5-a3cd-31e0a8389122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b1ce996-6c66-4cd5-a3cd-31e0a8389122\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.617638 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c28b6405-4576-4ec2-b596-db1e1a35d148-config-data-default\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.617667 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28b6405-4576-4ec2-b596-db1e1a35d148-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.617705 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28b6405-4576-4ec2-b596-db1e1a35d148-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.617872 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c28b6405-4576-4ec2-b596-db1e1a35d148-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.618281 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c28b6405-4576-4ec2-b596-db1e1a35d148-kolla-config\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.618429 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xknnd\" (UniqueName: \"kubernetes.io/projected/c28b6405-4576-4ec2-b596-db1e1a35d148-kube-api-access-xknnd\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.618514 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c28b6405-4576-4ec2-b596-db1e1a35d148-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.618677 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.618700 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc43cb1-a1e5-4e46-98ed-071fd960a438-config\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.618717 4901 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xttzt\" (UniqueName: \"kubernetes.io/projected/efc43cb1-a1e5-4e46-98ed-071fd960a438-kube-api-access-xttzt\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:46 crc kubenswrapper[4901]: W0309 04:06:46.688752 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41f7ab2_8b3d_4fac_84b0_6c884673bce9.slice/crio-842f9dfb9f2044df8dd7480009a8f9a00fe2084f3ee56b3b8c901355cb7d0692 WatchSource:0}: Error finding container 842f9dfb9f2044df8dd7480009a8f9a00fe2084f3ee56b3b8c901355cb7d0692: Status 404 returned error can't find the container with id 842f9dfb9f2044df8dd7480009a8f9a00fe2084f3ee56b3b8c901355cb7d0692 Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.695913 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.719448 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xknnd\" (UniqueName: \"kubernetes.io/projected/c28b6405-4576-4ec2-b596-db1e1a35d148-kube-api-access-xknnd\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.719494 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c28b6405-4576-4ec2-b596-db1e1a35d148-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.719529 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9b1ce996-6c66-4cd5-a3cd-31e0a8389122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b1ce996-6c66-4cd5-a3cd-31e0a8389122\") pod \"openstack-galera-0\" (UID: 
\"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.719559 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c28b6405-4576-4ec2-b596-db1e1a35d148-config-data-default\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.719574 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28b6405-4576-4ec2-b596-db1e1a35d148-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.719597 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28b6405-4576-4ec2-b596-db1e1a35d148-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.719619 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c28b6405-4576-4ec2-b596-db1e1a35d148-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.719667 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c28b6405-4576-4ec2-b596-db1e1a35d148-kolla-config\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 
04:06:46.720211 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c28b6405-4576-4ec2-b596-db1e1a35d148-kolla-config\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.722527 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c28b6405-4576-4ec2-b596-db1e1a35d148-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.723833 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c28b6405-4576-4ec2-b596-db1e1a35d148-config-data-default\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.724189 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c28b6405-4576-4ec2-b596-db1e1a35d148-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.725550 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.725632 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9b1ce996-6c66-4cd5-a3cd-31e0a8389122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b1ce996-6c66-4cd5-a3cd-31e0a8389122\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3e54ae82878822586cfdf38e022ada89c6b79df7d4eea5e8a01225c133af3060/globalmount\"" pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.728186 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28b6405-4576-4ec2-b596-db1e1a35d148-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.734668 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28b6405-4576-4ec2-b596-db1e1a35d148-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.737602 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xknnd\" (UniqueName: \"kubernetes.io/projected/c28b6405-4576-4ec2-b596-db1e1a35d148-kube-api-access-xknnd\") pod \"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.766353 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9b1ce996-6c66-4cd5-a3cd-31e0a8389122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b1ce996-6c66-4cd5-a3cd-31e0a8389122\") pod 
\"openstack-galera-0\" (UID: \"c28b6405-4576-4ec2-b596-db1e1a35d148\") " pod="openstack/openstack-galera-0" Mar 09 04:06:46 crc kubenswrapper[4901]: I0309 04:06:46.862752 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.090257 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" event={"ID":"efc43cb1-a1e5-4e46-98ed-071fd960a438","Type":"ContainerDied","Data":"66f326bbbfc5f0797d1e5974795313d7ffca44cad786002322f9758403fa07d4"} Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.090607 4901 scope.go:117] "RemoveContainer" containerID="1d681f8fddea43684e9e4509a52b1fb7a90e8fb6f8d23b8ac86fc1f2c46983ca" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.090312 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-sg4nn" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.106956 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" event={"ID":"274fc5c2-4ab4-47b9-b147-d0a47e22b70e","Type":"ContainerStarted","Data":"442a4abd63bbab64729a563c9165f3aeec4099412eea7769e69110fa68ab6817"} Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.107042 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.111201 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-7bvgc" event={"ID":"1f256e2a-9089-41fe-a06b-a44d5f53ac1c","Type":"ContainerDied","Data":"54d7c54b242398aa60fa17dbd4d0c855cc542053362b4b2b4997557735a3c3ce"} Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.111300 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-7bvgc" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.116363 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c41f7ab2-8b3d-4fac-84b0-6c884673bce9","Type":"ContainerStarted","Data":"842f9dfb9f2044df8dd7480009a8f9a00fe2084f3ee56b3b8c901355cb7d0692"} Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.122347 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vsmcd" podUID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerName="registry-server" containerID="cri-o://d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b" gracePeriod=2 Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.123065 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2b9c342-0b0e-486b-bc0c-2abd0319879d","Type":"ContainerStarted","Data":"bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae"} Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.123081 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2b9c342-0b0e-486b-bc0c-2abd0319879d","Type":"ContainerStarted","Data":"5bb5463bbbdca605f84adaabebc5a703a00a739549f4156758df9372b2fce0e5"} Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.148077 4901 scope.go:117] "RemoveContainer" containerID="0abd78b78fecbdc942dbaa2a4d76bf4e5892e555d35373367091b0065bba5435" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.169157 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" podStartSLOduration=3.169139064 podStartE2EDuration="3.169139064s" podCreationTimestamp="2026-03-09 04:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:06:47.133145199 
+0000 UTC m=+5131.722808951" watchObservedRunningTime="2026-03-09 04:06:47.169139064 +0000 UTC m=+5131.758802796" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.238248 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.410259 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-sg4nn"] Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.416334 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-sg4nn"] Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.443853 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-7bvgc"] Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.450427 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c44667757-7bvgc"] Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.571423 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.640899 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-catalog-content\") pod \"0527d7c3-c662-41df-bb0c-7bc56b6da775\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.641083 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv59p\" (UniqueName: \"kubernetes.io/projected/0527d7c3-c662-41df-bb0c-7bc56b6da775-kube-api-access-bv59p\") pod \"0527d7c3-c662-41df-bb0c-7bc56b6da775\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.641170 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-utilities\") pod \"0527d7c3-c662-41df-bb0c-7bc56b6da775\" (UID: \"0527d7c3-c662-41df-bb0c-7bc56b6da775\") " Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.642088 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-utilities" (OuterVolumeSpecName: "utilities") pod "0527d7c3-c662-41df-bb0c-7bc56b6da775" (UID: "0527d7c3-c662-41df-bb0c-7bc56b6da775"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.706815 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0527d7c3-c662-41df-bb0c-7bc56b6da775" (UID: "0527d7c3-c662-41df-bb0c-7bc56b6da775"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.711121 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0527d7c3-c662-41df-bb0c-7bc56b6da775-kube-api-access-bv59p" (OuterVolumeSpecName: "kube-api-access-bv59p") pod "0527d7c3-c662-41df-bb0c-7bc56b6da775" (UID: "0527d7c3-c662-41df-bb0c-7bc56b6da775"). InnerVolumeSpecName "kube-api-access-bv59p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.742730 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv59p\" (UniqueName: \"kubernetes.io/projected/0527d7c3-c662-41df-bb0c-7bc56b6da775-kube-api-access-bv59p\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.742771 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:47 crc kubenswrapper[4901]: I0309 04:06:47.742785 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0527d7c3-c662-41df-bb0c-7bc56b6da775-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.087983 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 04:06:48 crc kubenswrapper[4901]: E0309 04:06:48.088612 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerName="registry-server" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.088626 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerName="registry-server" Mar 09 04:06:48 crc kubenswrapper[4901]: E0309 04:06:48.088652 4901 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerName="extract-utilities" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.088658 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerName="extract-utilities" Mar 09 04:06:48 crc kubenswrapper[4901]: E0309 04:06:48.088670 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerName="extract-content" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.088678 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerName="extract-content" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.088811 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerName="registry-server" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.089919 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.093172 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.094754 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.094878 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.096438 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mlrqk" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.130426 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f256e2a-9089-41fe-a06b-a44d5f53ac1c" 
path="/var/lib/kubelet/pods/1f256e2a-9089-41fe-a06b-a44d5f53ac1c/volumes" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.133605 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc43cb1-a1e5-4e46-98ed-071fd960a438" path="/var/lib/kubelet/pods/efc43cb1-a1e5-4e46-98ed-071fd960a438/volumes" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.142759 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c28b6405-4576-4ec2-b596-db1e1a35d148","Type":"ContainerStarted","Data":"87673c06e1e227ea8f6679752c17599619befa44f5d858c833ad7a9ba6bf4cd3"} Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.142817 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c28b6405-4576-4ec2-b596-db1e1a35d148","Type":"ContainerStarted","Data":"effde34395c02305c8fe7ad84bfbc7128df44175ad4549fc6c7d13f48dd248ad"} Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.156745 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c41f7ab2-8b3d-4fac-84b0-6c884673bce9","Type":"ContainerStarted","Data":"f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05"} Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.178450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" event={"ID":"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f","Type":"ContainerStarted","Data":"072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0"} Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.178664 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.185611 4901 generic.go:334] "Generic (PLEG): container finished" podID="0527d7c3-c662-41df-bb0c-7bc56b6da775" containerID="d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b" exitCode=0 Mar 09 
04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.185728 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vsmcd" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.186256 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsmcd" event={"ID":"0527d7c3-c662-41df-bb0c-7bc56b6da775","Type":"ContainerDied","Data":"d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b"} Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.186280 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsmcd" event={"ID":"0527d7c3-c662-41df-bb0c-7bc56b6da775","Type":"ContainerDied","Data":"160df8621c49b3578480995fb8761e1074a9eb9c391bca89210e680ff09b9a4e"} Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.186295 4901 scope.go:117] "RemoveContainer" containerID="d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.188583 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.229257 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vsmcd"] Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.231400 4901 scope.go:117] "RemoveContainer" containerID="6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.238632 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vsmcd"] Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.240363 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" podStartSLOduration=4.240348363 podStartE2EDuration="4.240348363s" podCreationTimestamp="2026-03-09 04:06:44 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:06:48.236466686 +0000 UTC m=+5132.826130418" watchObservedRunningTime="2026-03-09 04:06:48.240348363 +0000 UTC m=+5132.830012095" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.251576 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5de13b0e-52fc-4b56-b8bc-28e67614db67-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.251784 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de13b0e-52fc-4b56-b8bc-28e67614db67-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.252355 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rf7b\" (UniqueName: \"kubernetes.io/projected/5de13b0e-52fc-4b56-b8bc-28e67614db67-kube-api-access-6rf7b\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.252432 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5de13b0e-52fc-4b56-b8bc-28e67614db67-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.252560 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5de13b0e-52fc-4b56-b8bc-28e67614db67-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.252642 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de13b0e-52fc-4b56-b8bc-28e67614db67-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.253442 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de13b0e-52fc-4b56-b8bc-28e67614db67-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.253681 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-36e314ba-d20d-4b42-bbf5-2abf3f828370\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-36e314ba-d20d-4b42-bbf5-2abf3f828370\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.254730 4901 scope.go:117] "RemoveContainer" containerID="e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.276519 4901 scope.go:117] "RemoveContainer" containerID="d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b" Mar 09 04:06:48 crc kubenswrapper[4901]: E0309 04:06:48.277105 4901 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b\": container with ID starting with d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b not found: ID does not exist" containerID="d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.277334 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b"} err="failed to get container status \"d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b\": rpc error: code = NotFound desc = could not find container \"d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b\": container with ID starting with d1de0d291c025dd4ecbc55849aa3a6b747dacd88cfa53cef65f698222cc1a01b not found: ID does not exist" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.277457 4901 scope.go:117] "RemoveContainer" containerID="6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699" Mar 09 04:06:48 crc kubenswrapper[4901]: E0309 04:06:48.280113 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699\": container with ID starting with 6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699 not found: ID does not exist" containerID="6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.280278 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699"} err="failed to get container status \"6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699\": rpc error: code = NotFound desc = could 
not find container \"6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699\": container with ID starting with 6f75caace87d2e602cdfc152b4ded0991c8093063410d04d17fcf35c0c6c9699 not found: ID does not exist" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.280422 4901 scope.go:117] "RemoveContainer" containerID="e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3" Mar 09 04:06:48 crc kubenswrapper[4901]: E0309 04:06:48.280841 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3\": container with ID starting with e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3 not found: ID does not exist" containerID="e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.280891 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3"} err="failed to get container status \"e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3\": rpc error: code = NotFound desc = could not find container \"e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3\": container with ID starting with e80e9ff17682a3282e2f9eed155e2c903dbb77f63a8c1d00d8e04ff8f4837ef3 not found: ID does not exist" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.356023 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de13b0e-52fc-4b56-b8bc-28e67614db67-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.356486 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6rf7b\" (UniqueName: \"kubernetes.io/projected/5de13b0e-52fc-4b56-b8bc-28e67614db67-kube-api-access-6rf7b\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.356627 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5de13b0e-52fc-4b56-b8bc-28e67614db67-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.356829 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5de13b0e-52fc-4b56-b8bc-28e67614db67-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.357315 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de13b0e-52fc-4b56-b8bc-28e67614db67-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.357429 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de13b0e-52fc-4b56-b8bc-28e67614db67-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.357632 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-36e314ba-d20d-4b42-bbf5-2abf3f828370\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-36e314ba-d20d-4b42-bbf5-2abf3f828370\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.357688 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5de13b0e-52fc-4b56-b8bc-28e67614db67-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.357841 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5de13b0e-52fc-4b56-b8bc-28e67614db67-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.358531 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5de13b0e-52fc-4b56-b8bc-28e67614db67-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.358555 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5de13b0e-52fc-4b56-b8bc-28e67614db67-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.360338 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5de13b0e-52fc-4b56-b8bc-28e67614db67-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.360831 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de13b0e-52fc-4b56-b8bc-28e67614db67-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.366595 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.366625 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-36e314ba-d20d-4b42-bbf5-2abf3f828370\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-36e314ba-d20d-4b42-bbf5-2abf3f828370\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5b6de12b8f50d358365d93c049429b5654d6d827b8e1cf2e21adeeb5cfd26dcd/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.371238 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rf7b\" (UniqueName: \"kubernetes.io/projected/5de13b0e-52fc-4b56-b8bc-28e67614db67-kube-api-access-6rf7b\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.372493 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5de13b0e-52fc-4b56-b8bc-28e67614db67-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.397116 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-36e314ba-d20d-4b42-bbf5-2abf3f828370\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-36e314ba-d20d-4b42-bbf5-2abf3f828370\") pod \"openstack-cell1-galera-0\" (UID: \"5de13b0e-52fc-4b56-b8bc-28e67614db67\") " pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.465581 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.466661 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.471057 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.471243 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.471293 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nwgp8" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.474749 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.481820 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.560305 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/929b82d8-ca48-4ccc-b938-8224fae0ac25-memcached-tls-certs\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.560590 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnzmk\" (UniqueName: \"kubernetes.io/projected/929b82d8-ca48-4ccc-b938-8224fae0ac25-kube-api-access-rnzmk\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.560656 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/929b82d8-ca48-4ccc-b938-8224fae0ac25-kolla-config\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.560685 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/929b82d8-ca48-4ccc-b938-8224fae0ac25-config-data\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.560702 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/929b82d8-ca48-4ccc-b938-8224fae0ac25-combined-ca-bundle\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.662856 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/929b82d8-ca48-4ccc-b938-8224fae0ac25-kolla-config\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.662914 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/929b82d8-ca48-4ccc-b938-8224fae0ac25-config-data\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.662931 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929b82d8-ca48-4ccc-b938-8224fae0ac25-combined-ca-bundle\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.662981 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/929b82d8-ca48-4ccc-b938-8224fae0ac25-memcached-tls-certs\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.663008 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnzmk\" (UniqueName: \"kubernetes.io/projected/929b82d8-ca48-4ccc-b938-8224fae0ac25-kube-api-access-rnzmk\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc 
kubenswrapper[4901]: I0309 04:06:48.663562 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/929b82d8-ca48-4ccc-b938-8224fae0ac25-kolla-config\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.664422 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/929b82d8-ca48-4ccc-b938-8224fae0ac25-config-data\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.667999 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929b82d8-ca48-4ccc-b938-8224fae0ac25-combined-ca-bundle\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.669471 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/929b82d8-ca48-4ccc-b938-8224fae0ac25-memcached-tls-certs\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.683772 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnzmk\" (UniqueName: \"kubernetes.io/projected/929b82d8-ca48-4ccc-b938-8224fae0ac25-kube-api-access-rnzmk\") pod \"memcached-0\" (UID: \"929b82d8-ca48-4ccc-b938-8224fae0ac25\") " pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.779886 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 09 04:06:48 crc kubenswrapper[4901]: I0309 04:06:48.920165 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 04:06:49 crc kubenswrapper[4901]: I0309 04:06:49.183658 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 04:06:49 crc kubenswrapper[4901]: W0309 04:06:49.187954 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod929b82d8_ca48_4ccc_b938_8224fae0ac25.slice/crio-35a29b25e0f90cac09bd69bb709c5edd47d44ac5258db256c9c0a24762ecc489 WatchSource:0}: Error finding container 35a29b25e0f90cac09bd69bb709c5edd47d44ac5258db256c9c0a24762ecc489: Status 404 returned error can't find the container with id 35a29b25e0f90cac09bd69bb709c5edd47d44ac5258db256c9c0a24762ecc489 Mar 09 04:06:49 crc kubenswrapper[4901]: I0309 04:06:49.198475 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5de13b0e-52fc-4b56-b8bc-28e67614db67","Type":"ContainerStarted","Data":"ed623bdfcf6f73cf5dbb8fc814b588dea55fbf0cce5570529d7f686969e73014"} Mar 09 04:06:49 crc kubenswrapper[4901]: I0309 04:06:49.198537 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5de13b0e-52fc-4b56-b8bc-28e67614db67","Type":"ContainerStarted","Data":"6bd6a36aaac0123df31a43b80e26e16f4a7e6056521fc6554443ad7d574f949f"} Mar 09 04:06:50 crc kubenswrapper[4901]: I0309 04:06:50.125513 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0527d7c3-c662-41df-bb0c-7bc56b6da775" path="/var/lib/kubelet/pods/0527d7c3-c662-41df-bb0c-7bc56b6da775/volumes" Mar 09 04:06:50 crc kubenswrapper[4901]: I0309 04:06:50.216946 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"929b82d8-ca48-4ccc-b938-8224fae0ac25","Type":"ContainerStarted","Data":"5485efa3be6eb81bd56a406468c31cd31c3bc3ca33504478a24de0df3ec1b7e1"} Mar 09 04:06:50 crc kubenswrapper[4901]: I0309 04:06:50.217736 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"929b82d8-ca48-4ccc-b938-8224fae0ac25","Type":"ContainerStarted","Data":"35a29b25e0f90cac09bd69bb709c5edd47d44ac5258db256c9c0a24762ecc489"} Mar 09 04:06:50 crc kubenswrapper[4901]: I0309 04:06:50.217839 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 09 04:06:50 crc kubenswrapper[4901]: I0309 04:06:50.241588 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.241562568 podStartE2EDuration="2.241562568s" podCreationTimestamp="2026-03-09 04:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:06:50.240923682 +0000 UTC m=+5134.830587444" watchObservedRunningTime="2026-03-09 04:06:50.241562568 +0000 UTC m=+5134.831226330" Mar 09 04:06:52 crc kubenswrapper[4901]: I0309 04:06:52.241273 4901 generic.go:334] "Generic (PLEG): container finished" podID="c28b6405-4576-4ec2-b596-db1e1a35d148" containerID="87673c06e1e227ea8f6679752c17599619befa44f5d858c833ad7a9ba6bf4cd3" exitCode=0 Mar 09 04:06:52 crc kubenswrapper[4901]: I0309 04:06:52.241418 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c28b6405-4576-4ec2-b596-db1e1a35d148","Type":"ContainerDied","Data":"87673c06e1e227ea8f6679752c17599619befa44f5d858c833ad7a9ba6bf4cd3"} Mar 09 04:06:53 crc kubenswrapper[4901]: I0309 04:06:53.255482 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"c28b6405-4576-4ec2-b596-db1e1a35d148","Type":"ContainerStarted","Data":"efaf9b495b91ae26402c81028274931e95570f0e2664cbb6f414f69ccf2b407b"} Mar 09 04:06:53 crc kubenswrapper[4901]: I0309 04:06:53.282286 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.282265822 podStartE2EDuration="8.282265822s" podCreationTimestamp="2026-03-09 04:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:06:53.277081013 +0000 UTC m=+5137.866744805" watchObservedRunningTime="2026-03-09 04:06:53.282265822 +0000 UTC m=+5137.871929584" Mar 09 04:06:54 crc kubenswrapper[4901]: I0309 04:06:54.266647 4901 generic.go:334] "Generic (PLEG): container finished" podID="5de13b0e-52fc-4b56-b8bc-28e67614db67" containerID="ed623bdfcf6f73cf5dbb8fc814b588dea55fbf0cce5570529d7f686969e73014" exitCode=0 Mar 09 04:06:54 crc kubenswrapper[4901]: I0309 04:06:54.266739 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5de13b0e-52fc-4b56-b8bc-28e67614db67","Type":"ContainerDied","Data":"ed623bdfcf6f73cf5dbb8fc814b588dea55fbf0cce5570529d7f686969e73014"} Mar 09 04:06:54 crc kubenswrapper[4901]: I0309 04:06:54.610398 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:54 crc kubenswrapper[4901]: I0309 04:06:54.939449 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:06:54 crc kubenswrapper[4901]: I0309 04:06:54.974166 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-mtds9"] Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.277669 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"5de13b0e-52fc-4b56-b8bc-28e67614db67","Type":"ContainerStarted","Data":"d98f752a7772dd3a631d83c285fc1d0f2ff1ba6fcfee87e18cc034ae88300fb9"} Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.278531 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" podUID="c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" containerName="dnsmasq-dns" containerID="cri-o://072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0" gracePeriod=10 Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.305908 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.305893456 podStartE2EDuration="8.305893456s" podCreationTimestamp="2026-03-09 04:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:06:55.300540563 +0000 UTC m=+5139.890204295" watchObservedRunningTime="2026-03-09 04:06:55.305893456 +0000 UTC m=+5139.895557188" Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.728761 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.791158 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-dns-svc\") pod \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.791252 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t48qf\" (UniqueName: \"kubernetes.io/projected/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-kube-api-access-t48qf\") pod \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.791402 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-config\") pod \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\" (UID: \"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f\") " Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.800373 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-kube-api-access-t48qf" (OuterVolumeSpecName: "kube-api-access-t48qf") pod "c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" (UID: "c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f"). InnerVolumeSpecName "kube-api-access-t48qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.845082 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-config" (OuterVolumeSpecName: "config") pod "c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" (UID: "c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.850193 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" (UID: "c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.893032 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-config\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.893064 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:55 crc kubenswrapper[4901]: I0309 04:06:55.893075 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t48qf\" (UniqueName: \"kubernetes.io/projected/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f-kube-api-access-t48qf\") on node \"crc\" DevicePath \"\"" Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.290392 4901 generic.go:334] "Generic (PLEG): container finished" podID="c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" containerID="072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0" exitCode=0 Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.290569 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" event={"ID":"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f","Type":"ContainerDied","Data":"072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0"} Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.290846 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" 
event={"ID":"c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f","Type":"ContainerDied","Data":"6da675dd11cbf9058b6de1fe5a3f9df945512678c1afc5c67904dadef5c2f6b0"} Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.290885 4901 scope.go:117] "RemoveContainer" containerID="072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0" Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.290652 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-mtds9" Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.321413 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-mtds9"] Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.329167 4901 scope.go:117] "RemoveContainer" containerID="6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0" Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.329544 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-mtds9"] Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.357653 4901 scope.go:117] "RemoveContainer" containerID="072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0" Mar 09 04:06:56 crc kubenswrapper[4901]: E0309 04:06:56.359084 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0\": container with ID starting with 072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0 not found: ID does not exist" containerID="072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0" Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.359167 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0"} err="failed to get container status 
\"072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0\": rpc error: code = NotFound desc = could not find container \"072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0\": container with ID starting with 072203436f2da36cce61b0c8f604b9b630c571786975844ac109782eb15bf3f0 not found: ID does not exist" Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.359211 4901 scope.go:117] "RemoveContainer" containerID="6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0" Mar 09 04:06:56 crc kubenswrapper[4901]: E0309 04:06:56.360466 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0\": container with ID starting with 6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0 not found: ID does not exist" containerID="6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0" Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.360562 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0"} err="failed to get container status \"6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0\": rpc error: code = NotFound desc = could not find container \"6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0\": container with ID starting with 6d34726aabc155a18738220e8f98f586e26a50f272162b3a169e255e6c92bed0 not found: ID does not exist" Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.863929 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 09 04:06:56 crc kubenswrapper[4901]: I0309 04:06:56.863977 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 09 04:06:58 crc kubenswrapper[4901]: I0309 04:06:58.120846 4901 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" path="/var/lib/kubelet/pods/c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f/volumes" Mar 09 04:06:58 crc kubenswrapper[4901]: I0309 04:06:58.475148 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:58 crc kubenswrapper[4901]: I0309 04:06:58.475197 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 09 04:06:58 crc kubenswrapper[4901]: I0309 04:06:58.780896 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 09 04:06:59 crc kubenswrapper[4901]: I0309 04:06:59.317697 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 09 04:06:59 crc kubenswrapper[4901]: I0309 04:06:59.431070 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 09 04:07:00 crc kubenswrapper[4901]: I0309 04:07:00.863492 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:07:00 crc kubenswrapper[4901]: I0309 04:07:00.863592 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:07:00 crc kubenswrapper[4901]: I0309 04:07:00.974978 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 09 04:07:01 crc kubenswrapper[4901]: I0309 
04:07:01.085621 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.592774 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6kxm9"] Mar 09 04:07:05 crc kubenswrapper[4901]: E0309 04:07:05.593343 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" containerName="init" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.593356 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" containerName="init" Mar 09 04:07:05 crc kubenswrapper[4901]: E0309 04:07:05.593378 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" containerName="dnsmasq-dns" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.593384 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" containerName="dnsmasq-dns" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.593521 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57ee1f2-a2a5-45dd-9f0b-9eab37ae9b1f" containerName="dnsmasq-dns" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.594017 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6kxm9" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.596814 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.606537 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6kxm9"] Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.762999 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13e8184-817e-437b-be8f-d11ecef2b96a-operator-scripts\") pod \"root-account-create-update-6kxm9\" (UID: \"d13e8184-817e-437b-be8f-d11ecef2b96a\") " pod="openstack/root-account-create-update-6kxm9" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.763078 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8tmq\" (UniqueName: \"kubernetes.io/projected/d13e8184-817e-437b-be8f-d11ecef2b96a-kube-api-access-r8tmq\") pod \"root-account-create-update-6kxm9\" (UID: \"d13e8184-817e-437b-be8f-d11ecef2b96a\") " pod="openstack/root-account-create-update-6kxm9" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.865361 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13e8184-817e-437b-be8f-d11ecef2b96a-operator-scripts\") pod \"root-account-create-update-6kxm9\" (UID: \"d13e8184-817e-437b-be8f-d11ecef2b96a\") " pod="openstack/root-account-create-update-6kxm9" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.865435 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8tmq\" (UniqueName: \"kubernetes.io/projected/d13e8184-817e-437b-be8f-d11ecef2b96a-kube-api-access-r8tmq\") pod \"root-account-create-update-6kxm9\" (UID: 
\"d13e8184-817e-437b-be8f-d11ecef2b96a\") " pod="openstack/root-account-create-update-6kxm9" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.866756 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13e8184-817e-437b-be8f-d11ecef2b96a-operator-scripts\") pod \"root-account-create-update-6kxm9\" (UID: \"d13e8184-817e-437b-be8f-d11ecef2b96a\") " pod="openstack/root-account-create-update-6kxm9" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.895774 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8tmq\" (UniqueName: \"kubernetes.io/projected/d13e8184-817e-437b-be8f-d11ecef2b96a-kube-api-access-r8tmq\") pod \"root-account-create-update-6kxm9\" (UID: \"d13e8184-817e-437b-be8f-d11ecef2b96a\") " pod="openstack/root-account-create-update-6kxm9" Mar 09 04:07:05 crc kubenswrapper[4901]: I0309 04:07:05.935314 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6kxm9" Mar 09 04:07:06 crc kubenswrapper[4901]: I0309 04:07:06.443040 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6kxm9"] Mar 09 04:07:07 crc kubenswrapper[4901]: I0309 04:07:07.395304 4901 generic.go:334] "Generic (PLEG): container finished" podID="d13e8184-817e-437b-be8f-d11ecef2b96a" containerID="c8c9ad9a119d0e3744ac90f19b2e3b0fcd6cb5fd7c18bc92d27dafc5f8babe31" exitCode=0 Mar 09 04:07:07 crc kubenswrapper[4901]: I0309 04:07:07.395359 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6kxm9" event={"ID":"d13e8184-817e-437b-be8f-d11ecef2b96a","Type":"ContainerDied","Data":"c8c9ad9a119d0e3744ac90f19b2e3b0fcd6cb5fd7c18bc92d27dafc5f8babe31"} Mar 09 04:07:07 crc kubenswrapper[4901]: I0309 04:07:07.395408 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6kxm9" event={"ID":"d13e8184-817e-437b-be8f-d11ecef2b96a","Type":"ContainerStarted","Data":"c2d1d191222aee95de40d7d26d8dfa77215cc026d0680f8dcbca09bc32eb64bf"} Mar 09 04:07:08 crc kubenswrapper[4901]: I0309 04:07:08.774489 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6kxm9" Mar 09 04:07:08 crc kubenswrapper[4901]: I0309 04:07:08.916479 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13e8184-817e-437b-be8f-d11ecef2b96a-operator-scripts\") pod \"d13e8184-817e-437b-be8f-d11ecef2b96a\" (UID: \"d13e8184-817e-437b-be8f-d11ecef2b96a\") " Mar 09 04:07:08 crc kubenswrapper[4901]: I0309 04:07:08.916613 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8tmq\" (UniqueName: \"kubernetes.io/projected/d13e8184-817e-437b-be8f-d11ecef2b96a-kube-api-access-r8tmq\") pod \"d13e8184-817e-437b-be8f-d11ecef2b96a\" (UID: \"d13e8184-817e-437b-be8f-d11ecef2b96a\") " Mar 09 04:07:08 crc kubenswrapper[4901]: I0309 04:07:08.917480 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13e8184-817e-437b-be8f-d11ecef2b96a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d13e8184-817e-437b-be8f-d11ecef2b96a" (UID: "d13e8184-817e-437b-be8f-d11ecef2b96a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:07:08 crc kubenswrapper[4901]: I0309 04:07:08.926267 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13e8184-817e-437b-be8f-d11ecef2b96a-kube-api-access-r8tmq" (OuterVolumeSpecName: "kube-api-access-r8tmq") pod "d13e8184-817e-437b-be8f-d11ecef2b96a" (UID: "d13e8184-817e-437b-be8f-d11ecef2b96a"). InnerVolumeSpecName "kube-api-access-r8tmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:07:09 crc kubenswrapper[4901]: I0309 04:07:09.017726 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13e8184-817e-437b-be8f-d11ecef2b96a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:09 crc kubenswrapper[4901]: I0309 04:07:09.017763 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8tmq\" (UniqueName: \"kubernetes.io/projected/d13e8184-817e-437b-be8f-d11ecef2b96a-kube-api-access-r8tmq\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:09 crc kubenswrapper[4901]: I0309 04:07:09.417441 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6kxm9" event={"ID":"d13e8184-817e-437b-be8f-d11ecef2b96a","Type":"ContainerDied","Data":"c2d1d191222aee95de40d7d26d8dfa77215cc026d0680f8dcbca09bc32eb64bf"} Mar 09 04:07:09 crc kubenswrapper[4901]: I0309 04:07:09.417527 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2d1d191222aee95de40d7d26d8dfa77215cc026d0680f8dcbca09bc32eb64bf" Mar 09 04:07:09 crc kubenswrapper[4901]: I0309 04:07:09.417612 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6kxm9" Mar 09 04:07:12 crc kubenswrapper[4901]: I0309 04:07:12.131358 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6kxm9"] Mar 09 04:07:12 crc kubenswrapper[4901]: I0309 04:07:12.131699 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6kxm9"] Mar 09 04:07:14 crc kubenswrapper[4901]: I0309 04:07:14.115065 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13e8184-817e-437b-be8f-d11ecef2b96a" path="/var/lib/kubelet/pods/d13e8184-817e-437b-be8f-d11ecef2b96a/volumes" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.097241 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wp4lm"] Mar 09 04:07:17 crc kubenswrapper[4901]: E0309 04:07:17.098021 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13e8184-817e-437b-be8f-d11ecef2b96a" containerName="mariadb-account-create-update" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.098044 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13e8184-817e-437b-be8f-d11ecef2b96a" containerName="mariadb-account-create-update" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.098368 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13e8184-817e-437b-be8f-d11ecef2b96a" containerName="mariadb-account-create-update" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.099260 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wp4lm" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.104819 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.128850 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wp4lm"] Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.249866 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cgxs\" (UniqueName: \"kubernetes.io/projected/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-kube-api-access-6cgxs\") pod \"root-account-create-update-wp4lm\" (UID: \"a5c413e1-2b2d-45c5-be20-c8b4fc90c324\") " pod="openstack/root-account-create-update-wp4lm" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.249993 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-operator-scripts\") pod \"root-account-create-update-wp4lm\" (UID: \"a5c413e1-2b2d-45c5-be20-c8b4fc90c324\") " pod="openstack/root-account-create-update-wp4lm" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.351972 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cgxs\" (UniqueName: \"kubernetes.io/projected/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-kube-api-access-6cgxs\") pod \"root-account-create-update-wp4lm\" (UID: \"a5c413e1-2b2d-45c5-be20-c8b4fc90c324\") " pod="openstack/root-account-create-update-wp4lm" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.352086 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-operator-scripts\") pod \"root-account-create-update-wp4lm\" (UID: 
\"a5c413e1-2b2d-45c5-be20-c8b4fc90c324\") " pod="openstack/root-account-create-update-wp4lm" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.353008 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-operator-scripts\") pod \"root-account-create-update-wp4lm\" (UID: \"a5c413e1-2b2d-45c5-be20-c8b4fc90c324\") " pod="openstack/root-account-create-update-wp4lm" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.379215 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cgxs\" (UniqueName: \"kubernetes.io/projected/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-kube-api-access-6cgxs\") pod \"root-account-create-update-wp4lm\" (UID: \"a5c413e1-2b2d-45c5-be20-c8b4fc90c324\") " pod="openstack/root-account-create-update-wp4lm" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.424905 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wp4lm" Mar 09 04:07:17 crc kubenswrapper[4901]: I0309 04:07:17.911430 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wp4lm"] Mar 09 04:07:17 crc kubenswrapper[4901]: W0309 04:07:17.916548 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5c413e1_2b2d_45c5_be20_c8b4fc90c324.slice/crio-d6e0af5a12ca7ffc002532eafc013c0c1fb3100fc94df80c5bf4e481a8a2b968 WatchSource:0}: Error finding container d6e0af5a12ca7ffc002532eafc013c0c1fb3100fc94df80c5bf4e481a8a2b968: Status 404 returned error can't find the container with id d6e0af5a12ca7ffc002532eafc013c0c1fb3100fc94df80c5bf4e481a8a2b968 Mar 09 04:07:18 crc kubenswrapper[4901]: I0309 04:07:18.513420 4901 generic.go:334] "Generic (PLEG): container finished" podID="a5c413e1-2b2d-45c5-be20-c8b4fc90c324" containerID="a0abcad488c073232285e86fb3e59aaec44a6017e059d27e01e0859726eb2882" exitCode=0 Mar 09 04:07:18 crc kubenswrapper[4901]: I0309 04:07:18.513511 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wp4lm" event={"ID":"a5c413e1-2b2d-45c5-be20-c8b4fc90c324","Type":"ContainerDied","Data":"a0abcad488c073232285e86fb3e59aaec44a6017e059d27e01e0859726eb2882"} Mar 09 04:07:18 crc kubenswrapper[4901]: I0309 04:07:18.513704 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wp4lm" event={"ID":"a5c413e1-2b2d-45c5-be20-c8b4fc90c324","Type":"ContainerStarted","Data":"d6e0af5a12ca7ffc002532eafc013c0c1fb3100fc94df80c5bf4e481a8a2b968"} Mar 09 04:07:19 crc kubenswrapper[4901]: I0309 04:07:19.524859 4901 generic.go:334] "Generic (PLEG): container finished" podID="a2b9c342-0b0e-486b-bc0c-2abd0319879d" containerID="bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae" exitCode=0 Mar 09 04:07:19 crc kubenswrapper[4901]: I0309 04:07:19.524977 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2b9c342-0b0e-486b-bc0c-2abd0319879d","Type":"ContainerDied","Data":"bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae"} Mar 09 04:07:19 crc kubenswrapper[4901]: I0309 04:07:19.877601 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wp4lm" Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.004297 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-operator-scripts\") pod \"a5c413e1-2b2d-45c5-be20-c8b4fc90c324\" (UID: \"a5c413e1-2b2d-45c5-be20-c8b4fc90c324\") " Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.004384 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cgxs\" (UniqueName: \"kubernetes.io/projected/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-kube-api-access-6cgxs\") pod \"a5c413e1-2b2d-45c5-be20-c8b4fc90c324\" (UID: \"a5c413e1-2b2d-45c5-be20-c8b4fc90c324\") " Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.005551 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5c413e1-2b2d-45c5-be20-c8b4fc90c324" (UID: "a5c413e1-2b2d-45c5-be20-c8b4fc90c324"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.009033 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-kube-api-access-6cgxs" (OuterVolumeSpecName: "kube-api-access-6cgxs") pod "a5c413e1-2b2d-45c5-be20-c8b4fc90c324" (UID: "a5c413e1-2b2d-45c5-be20-c8b4fc90c324"). 
InnerVolumeSpecName "kube-api-access-6cgxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.106697 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.107353 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cgxs\" (UniqueName: \"kubernetes.io/projected/a5c413e1-2b2d-45c5-be20-c8b4fc90c324-kube-api-access-6cgxs\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.534512 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wp4lm" event={"ID":"a5c413e1-2b2d-45c5-be20-c8b4fc90c324","Type":"ContainerDied","Data":"d6e0af5a12ca7ffc002532eafc013c0c1fb3100fc94df80c5bf4e481a8a2b968"} Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.534544 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wp4lm" Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.534561 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e0af5a12ca7ffc002532eafc013c0c1fb3100fc94df80c5bf4e481a8a2b968" Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.542171 4901 generic.go:334] "Generic (PLEG): container finished" podID="c41f7ab2-8b3d-4fac-84b0-6c884673bce9" containerID="f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05" exitCode=0 Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.542296 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c41f7ab2-8b3d-4fac-84b0-6c884673bce9","Type":"ContainerDied","Data":"f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05"} Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.547518 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2b9c342-0b0e-486b-bc0c-2abd0319879d","Type":"ContainerStarted","Data":"0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657"} Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.548431 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:20 crc kubenswrapper[4901]: I0309 04:07:20.613528 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.613502316 podStartE2EDuration="36.613502316s" podCreationTimestamp="2026-03-09 04:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:07:20.604175694 +0000 UTC m=+5165.193839476" watchObservedRunningTime="2026-03-09 04:07:20.613502316 +0000 UTC m=+5165.203166088" Mar 09 04:07:21 crc kubenswrapper[4901]: I0309 04:07:21.558664 4901 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c41f7ab2-8b3d-4fac-84b0-6c884673bce9","Type":"ContainerStarted","Data":"f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1"} Mar 09 04:07:21 crc kubenswrapper[4901]: I0309 04:07:21.559204 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 04:07:21 crc kubenswrapper[4901]: I0309 04:07:21.585620 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.5855865 podStartE2EDuration="37.5855865s" podCreationTimestamp="2026-03-09 04:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:07:21.582623466 +0000 UTC m=+5166.172287278" watchObservedRunningTime="2026-03-09 04:07:21.5855865 +0000 UTC m=+5166.175250232" Mar 09 04:07:30 crc kubenswrapper[4901]: I0309 04:07:30.863495 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:07:30 crc kubenswrapper[4901]: I0309 04:07:30.864137 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:07:30 crc kubenswrapper[4901]: I0309 04:07:30.864201 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 04:07:30 crc kubenswrapper[4901]: I0309 04:07:30.865124 4901 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee11181bba501a52807c1c6a38036b291b104f5a882250fea8dbae3a89c2ab93"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 04:07:30 crc kubenswrapper[4901]: I0309 04:07:30.865525 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://ee11181bba501a52807c1c6a38036b291b104f5a882250fea8dbae3a89c2ab93" gracePeriod=600 Mar 09 04:07:31 crc kubenswrapper[4901]: I0309 04:07:31.661735 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="ee11181bba501a52807c1c6a38036b291b104f5a882250fea8dbae3a89c2ab93" exitCode=0 Mar 09 04:07:31 crc kubenswrapper[4901]: I0309 04:07:31.661839 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"ee11181bba501a52807c1c6a38036b291b104f5a882250fea8dbae3a89c2ab93"} Mar 09 04:07:31 crc kubenswrapper[4901]: I0309 04:07:31.662360 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459"} Mar 09 04:07:31 crc kubenswrapper[4901]: I0309 04:07:31.662393 4901 scope.go:117] "RemoveContainer" containerID="f2eaaa94ef813f27ab89958e9931b25930a897defaee3221bd3f9fa043f32e39" Mar 09 04:07:35 crc kubenswrapper[4901]: I0309 04:07:35.743713 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:36 crc kubenswrapper[4901]: I0309 04:07:36.116079 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.063660 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-w8wd7"] Mar 09 04:07:45 crc kubenswrapper[4901]: E0309 04:07:45.065375 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c413e1-2b2d-45c5-be20-c8b4fc90c324" containerName="mariadb-account-create-update" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.065472 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c413e1-2b2d-45c5-be20-c8b4fc90c324" containerName="mariadb-account-create-update" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.065700 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c413e1-2b2d-45c5-be20-c8b4fc90c324" containerName="mariadb-account-create-update" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.066605 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.084019 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-w8wd7"] Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.131490 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-config\") pod \"dnsmasq-dns-66d5bf7c87-w8wd7\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.131605 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-w8wd7\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.131636 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7562\" (UniqueName: \"kubernetes.io/projected/40189f68-54f4-4d38-ab8a-5a470afd7461-kube-api-access-s7562\") pod \"dnsmasq-dns-66d5bf7c87-w8wd7\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.233452 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-config\") pod \"dnsmasq-dns-66d5bf7c87-w8wd7\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.233578 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-w8wd7\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.233608 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7562\" (UniqueName: \"kubernetes.io/projected/40189f68-54f4-4d38-ab8a-5a470afd7461-kube-api-access-s7562\") pod \"dnsmasq-dns-66d5bf7c87-w8wd7\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.234372 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-config\") pod \"dnsmasq-dns-66d5bf7c87-w8wd7\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.234752 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-w8wd7\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.262607 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7562\" (UniqueName: \"kubernetes.io/projected/40189f68-54f4-4d38-ab8a-5a470afd7461-kube-api-access-s7562\") pod \"dnsmasq-dns-66d5bf7c87-w8wd7\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.384859 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.766262 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 04:07:45 crc kubenswrapper[4901]: I0309 04:07:45.821443 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-w8wd7"] Mar 09 04:07:46 crc kubenswrapper[4901]: I0309 04:07:46.594512 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 04:07:46 crc kubenswrapper[4901]: I0309 04:07:46.817813 4901 generic.go:334] "Generic (PLEG): container finished" podID="40189f68-54f4-4d38-ab8a-5a470afd7461" containerID="8d1447f7850ff411a2fa3e4fb4674d47183a1fdf3f0b10f8efb1817d085aa4a4" exitCode=0 Mar 09 04:07:46 crc kubenswrapper[4901]: I0309 04:07:46.817854 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" event={"ID":"40189f68-54f4-4d38-ab8a-5a470afd7461","Type":"ContainerDied","Data":"8d1447f7850ff411a2fa3e4fb4674d47183a1fdf3f0b10f8efb1817d085aa4a4"} Mar 09 04:07:46 crc kubenswrapper[4901]: I0309 04:07:46.817879 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" event={"ID":"40189f68-54f4-4d38-ab8a-5a470afd7461","Type":"ContainerStarted","Data":"8a711aff529047836d40f77bd67f5d8f4099a0c4bde98d0143da8916356e0d12"} Mar 09 04:07:47 crc kubenswrapper[4901]: I0309 04:07:47.828132 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" event={"ID":"40189f68-54f4-4d38-ab8a-5a470afd7461","Type":"ContainerStarted","Data":"87123861efab25958328587e10c056c20fcb622dc1423128ed57f036eab73417"} Mar 09 04:07:47 crc kubenswrapper[4901]: I0309 04:07:47.829511 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:47 crc kubenswrapper[4901]: I0309 04:07:47.855015 4901 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" podStartSLOduration=2.854998016 podStartE2EDuration="2.854998016s" podCreationTimestamp="2026-03-09 04:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:07:47.854140355 +0000 UTC m=+5192.443804097" watchObservedRunningTime="2026-03-09 04:07:47.854998016 +0000 UTC m=+5192.444661748" Mar 09 04:07:50 crc kubenswrapper[4901]: I0309 04:07:50.239774 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c41f7ab2-8b3d-4fac-84b0-6c884673bce9" containerName="rabbitmq" containerID="cri-o://f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1" gracePeriod=604796 Mar 09 04:07:51 crc kubenswrapper[4901]: I0309 04:07:51.102077 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a2b9c342-0b0e-486b-bc0c-2abd0319879d" containerName="rabbitmq" containerID="cri-o://0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657" gracePeriod=604796 Mar 09 04:07:55 crc kubenswrapper[4901]: I0309 04:07:55.387601 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:07:55 crc kubenswrapper[4901]: I0309 04:07:55.478991 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-8kf9k"] Mar 09 04:07:55 crc kubenswrapper[4901]: I0309 04:07:55.479865 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" podUID="274fc5c2-4ab4-47b9-b147-d0a47e22b70e" containerName="dnsmasq-dns" containerID="cri-o://442a4abd63bbab64729a563c9165f3aeec4099412eea7769e69110fa68ab6817" gracePeriod=10 Mar 09 04:07:55 crc kubenswrapper[4901]: I0309 04:07:55.740358 4901 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a2b9c342-0b0e-486b-bc0c-2abd0319879d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.22:5671: connect: connection refused" Mar 09 04:07:55 crc kubenswrapper[4901]: I0309 04:07:55.905330 4901 generic.go:334] "Generic (PLEG): container finished" podID="274fc5c2-4ab4-47b9-b147-d0a47e22b70e" containerID="442a4abd63bbab64729a563c9165f3aeec4099412eea7769e69110fa68ab6817" exitCode=0 Mar 09 04:07:55 crc kubenswrapper[4901]: I0309 04:07:55.905567 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" event={"ID":"274fc5c2-4ab4-47b9-b147-d0a47e22b70e","Type":"ContainerDied","Data":"442a4abd63bbab64729a563c9165f3aeec4099412eea7769e69110fa68ab6817"} Mar 09 04:07:55 crc kubenswrapper[4901]: I0309 04:07:55.905691 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" event={"ID":"274fc5c2-4ab4-47b9-b147-d0a47e22b70e","Type":"ContainerDied","Data":"c3129d9e8558c167184340d0f1380f89af9d0f53d86f11c598847d18dfa32e35"} Mar 09 04:07:55 crc kubenswrapper[4901]: I0309 04:07:55.905705 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3129d9e8558c167184340d0f1380f89af9d0f53d86f11c598847d18dfa32e35" Mar 09 04:07:55 crc kubenswrapper[4901]: I0309 04:07:55.930478 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.003632 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzp7n\" (UniqueName: \"kubernetes.io/projected/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-kube-api-access-zzp7n\") pod \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.003688 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-config\") pod \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.003828 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-dns-svc\") pod \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\" (UID: \"274fc5c2-4ab4-47b9-b147-d0a47e22b70e\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.015629 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-kube-api-access-zzp7n" (OuterVolumeSpecName: "kube-api-access-zzp7n") pod "274fc5c2-4ab4-47b9-b147-d0a47e22b70e" (UID: "274fc5c2-4ab4-47b9-b147-d0a47e22b70e"). InnerVolumeSpecName "kube-api-access-zzp7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.036840 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "274fc5c2-4ab4-47b9-b147-d0a47e22b70e" (UID: "274fc5c2-4ab4-47b9-b147-d0a47e22b70e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.038907 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-config" (OuterVolumeSpecName: "config") pod "274fc5c2-4ab4-47b9-b147-d0a47e22b70e" (UID: "274fc5c2-4ab4-47b9-b147-d0a47e22b70e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.105902 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzp7n\" (UniqueName: \"kubernetes.io/projected/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-kube-api-access-zzp7n\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.105939 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-config\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.105949 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/274fc5c2-4ab4-47b9-b147-d0a47e22b70e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.109751 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c41f7ab2-8b3d-4fac-84b0-6c884673bce9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.23:5671: connect: connection refused" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.797924 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.918692 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-tls\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.918783 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-plugins-conf\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.918825 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw9jk\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-kube-api-access-qw9jk\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.918860 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-plugins\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.918981 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.919020 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-confd\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.919051 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-erlang-cookie-secret\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.919092 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-pod-info\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.919157 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-server-conf\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.919201 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-config-data\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.919242 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-erlang-cookie\") pod \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\" (UID: \"c41f7ab2-8b3d-4fac-84b0-6c884673bce9\") " Mar 09 04:07:56 
crc kubenswrapper[4901]: I0309 04:07:56.919410 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.919635 4901 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.919723 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.919885 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.927003 4901 generic.go:334] "Generic (PLEG): container finished" podID="c41f7ab2-8b3d-4fac-84b0-6c884673bce9" containerID="f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1" exitCode=0 Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.927085 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-8kf9k" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.927456 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.927731 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c41f7ab2-8b3d-4fac-84b0-6c884673bce9","Type":"ContainerDied","Data":"f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1"} Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.927758 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c41f7ab2-8b3d-4fac-84b0-6c884673bce9","Type":"ContainerDied","Data":"842f9dfb9f2044df8dd7480009a8f9a00fe2084f3ee56b3b8c901355cb7d0692"} Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.927773 4901 scope.go:117] "RemoveContainer" containerID="f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.932487 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-kube-api-access-qw9jk" (OuterVolumeSpecName: "kube-api-access-qw9jk") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "kube-api-access-qw9jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.942409 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.952743 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-pod-info" (OuterVolumeSpecName: "pod-info") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.955404 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.972990 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24" (OuterVolumeSpecName: "persistence") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "pvc-5f37a386-b7fa-4103-b738-f202db5aac24". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 04:07:56 crc kubenswrapper[4901]: I0309 04:07:56.984972 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-config-data" (OuterVolumeSpecName: "config-data") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.012180 4901 scope.go:117] "RemoveContainer" containerID="f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.019108 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-8kf9k"] Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.021155 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.021185 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5f37a386-b7fa-4103-b738-f202db5aac24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") on node \"crc\" " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.021195 4901 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.021206 4901 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.021216 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.021238 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.021246 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.021254 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw9jk\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-kube-api-access-qw9jk\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.023750 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-8kf9k"] Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.047856 4901 scope.go:117] "RemoveContainer" containerID="f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.048290 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-server-conf" (OuterVolumeSpecName: "server-conf") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: E0309 04:07:57.048356 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1\": container with ID starting with f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1 not found: ID does not exist" containerID="f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.048397 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1"} err="failed to get container status \"f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1\": rpc error: code = NotFound desc = could not find container \"f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1\": container with ID starting with f6b5f830d520afa2fc702a7eebf9cb8b5ac57253caebdf9d48393cb1fd469fc1 not found: ID does not exist" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.048430 4901 scope.go:117] "RemoveContainer" containerID="f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05" Mar 09 04:07:57 crc kubenswrapper[4901]: E0309 04:07:57.049209 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05\": container with ID starting with f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05 not found: ID does not exist" containerID="f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.049396 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05"} 
err="failed to get container status \"f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05\": rpc error: code = NotFound desc = could not find container \"f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05\": container with ID starting with f267741241bf2aa8a91630a2f6a0919dc17cb0b223265d4f0b3e2f3e7829eb05 not found: ID does not exist" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.050159 4901 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.050313 4901 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5f37a386-b7fa-4103-b738-f202db5aac24" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24") on node "crc" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.082828 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c41f7ab2-8b3d-4fac-84b0-6c884673bce9" (UID: "c41f7ab2-8b3d-4fac-84b0-6c884673bce9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.122425 4901 reconciler_common.go:293] "Volume detached for volume \"pvc-5f37a386-b7fa-4103-b738-f202db5aac24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.122463 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.122474 4901 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c41f7ab2-8b3d-4fac-84b0-6c884673bce9-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.316180 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.328137 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.351313 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 04:07:57 crc kubenswrapper[4901]: E0309 04:07:57.351661 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274fc5c2-4ab4-47b9-b147-d0a47e22b70e" containerName="dnsmasq-dns" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.351680 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="274fc5c2-4ab4-47b9-b147-d0a47e22b70e" containerName="dnsmasq-dns" Mar 09 04:07:57 crc kubenswrapper[4901]: E0309 04:07:57.351691 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41f7ab2-8b3d-4fac-84b0-6c884673bce9" containerName="setup-container" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 
04:07:57.351702 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41f7ab2-8b3d-4fac-84b0-6c884673bce9" containerName="setup-container" Mar 09 04:07:57 crc kubenswrapper[4901]: E0309 04:07:57.351718 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274fc5c2-4ab4-47b9-b147-d0a47e22b70e" containerName="init" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.351726 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="274fc5c2-4ab4-47b9-b147-d0a47e22b70e" containerName="init" Mar 09 04:07:57 crc kubenswrapper[4901]: E0309 04:07:57.351746 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41f7ab2-8b3d-4fac-84b0-6c884673bce9" containerName="rabbitmq" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.351754 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41f7ab2-8b3d-4fac-84b0-6c884673bce9" containerName="rabbitmq" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.351928 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="274fc5c2-4ab4-47b9-b147-d0a47e22b70e" containerName="dnsmasq-dns" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.351948 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41f7ab2-8b3d-4fac-84b0-6c884673bce9" containerName="rabbitmq" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.352838 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.355590 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.355708 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.355933 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.355954 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.356105 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.356315 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-w67xx" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.356600 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.362247 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429045 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b05fa0b-5691-466e-9256-812e4809adb2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429112 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429137 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b05fa0b-5691-466e-9256-812e4809adb2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429327 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b05fa0b-5691-466e-9256-812e4809adb2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429418 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429446 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5f37a386-b7fa-4103-b738-f202db5aac24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429470 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429502 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzb62\" (UniqueName: \"kubernetes.io/projected/8b05fa0b-5691-466e-9256-812e4809adb2-kube-api-access-zzb62\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429550 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b05fa0b-5691-466e-9256-812e4809adb2-config-data\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429569 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.429624 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b05fa0b-5691-466e-9256-812e4809adb2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.531490 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b05fa0b-5691-466e-9256-812e4809adb2-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.531591 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.531643 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b05fa0b-5691-466e-9256-812e4809adb2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.531716 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b05fa0b-5691-466e-9256-812e4809adb2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.531790 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.531812 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc 
kubenswrapper[4901]: I0309 04:07:57.531836 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5f37a386-b7fa-4103-b738-f202db5aac24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.531897 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzb62\" (UniqueName: \"kubernetes.io/projected/8b05fa0b-5691-466e-9256-812e4809adb2-kube-api-access-zzb62\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.531953 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b05fa0b-5691-466e-9256-812e4809adb2-config-data\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.531975 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.532035 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b05fa0b-5691-466e-9256-812e4809adb2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.532203 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.533017 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b05fa0b-5691-466e-9256-812e4809adb2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.533125 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b05fa0b-5691-466e-9256-812e4809adb2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.533393 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.533485 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b05fa0b-5691-466e-9256-812e4809adb2-config-data\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.535711 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b05fa0b-5691-466e-9256-812e4809adb2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " 
pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.536209 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.536324 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5f37a386-b7fa-4103-b738-f202db5aac24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5412389fc365b63b1ccd7e3b2d0bef28e36381a956388f06746ecca5231e1554/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.536947 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b05fa0b-5691-466e-9256-812e4809adb2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.537263 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.538136 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b05fa0b-5691-466e-9256-812e4809adb2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.550869 4901 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-zzb62\" (UniqueName: \"kubernetes.io/projected/8b05fa0b-5691-466e-9256-812e4809adb2-kube-api-access-zzb62\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.562880 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5f37a386-b7fa-4103-b738-f202db5aac24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f37a386-b7fa-4103-b738-f202db5aac24\") pod \"rabbitmq-server-0\" (UID: \"8b05fa0b-5691-466e-9256-812e4809adb2\") " pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.681861 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.698528 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.736769 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-server-conf\") pod \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.736828 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2b9c342-0b0e-486b-bc0c-2abd0319879d-erlang-cookie-secret\") pod \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.736868 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-confd\") pod 
\"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.736919 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-erlang-cookie\") pod \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.736951 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-tls\") pod \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.736984 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-config-data\") pod \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.737014 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-plugins\") pod \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.737065 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwxsw\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-kube-api-access-dwxsw\") pod \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.737099 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-plugins-conf\") pod \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.737310 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") pod \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.737371 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2b9c342-0b0e-486b-bc0c-2abd0319879d-pod-info\") pod \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\" (UID: \"a2b9c342-0b0e-486b-bc0c-2abd0319879d\") " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.752692 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.752881 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.753317 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.753554 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.759491 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-kube-api-access-dwxsw" (OuterVolumeSpecName: "kube-api-access-dwxsw") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "kube-api-access-dwxsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.758802 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a2b9c342-0b0e-486b-bc0c-2abd0319879d-pod-info" (OuterVolumeSpecName: "pod-info") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.762557 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b9c342-0b0e-486b-bc0c-2abd0319879d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.772677 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b" (OuterVolumeSpecName: "persistence") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.793925 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-config-data" (OuterVolumeSpecName: "config-data") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.807767 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-server-conf" (OuterVolumeSpecName: "server-conf") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.834274 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a2b9c342-0b0e-486b-bc0c-2abd0319879d" (UID: "a2b9c342-0b0e-486b-bc0c-2abd0319879d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838772 4901 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2b9c342-0b0e-486b-bc0c-2abd0319879d-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838810 4901 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838823 4901 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2b9c342-0b0e-486b-bc0c-2abd0319879d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838836 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838851 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838862 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838876 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838887 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2b9c342-0b0e-486b-bc0c-2abd0319879d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838900 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwxsw\" (UniqueName: \"kubernetes.io/projected/a2b9c342-0b0e-486b-bc0c-2abd0319879d-kube-api-access-dwxsw\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838911 4901 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2b9c342-0b0e-486b-bc0c-2abd0319879d-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.838955 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") on node \"crc\" " Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.868360 4901 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.868510 4901 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b") on node "crc" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.938338 4901 generic.go:334] "Generic (PLEG): container finished" podID="a2b9c342-0b0e-486b-bc0c-2abd0319879d" containerID="0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657" exitCode=0 Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.938375 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.938372 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2b9c342-0b0e-486b-bc0c-2abd0319879d","Type":"ContainerDied","Data":"0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657"} Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.938407 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2b9c342-0b0e-486b-bc0c-2abd0319879d","Type":"ContainerDied","Data":"5bb5463bbbdca605f84adaabebc5a703a00a739549f4156758df9372b2fce0e5"} Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.938422 4901 scope.go:117] "RemoveContainer" containerID="0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.939798 4901 reconciler_common.go:293] "Volume detached for volume \"pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") on node \"crc\" DevicePath \"\"" Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.978517 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] 
Mar 09 04:07:57 crc kubenswrapper[4901]: I0309 04:07:57.993941 4901 scope.go:117] "RemoveContainer" containerID="bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.023762 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.026745 4901 scope.go:117] "RemoveContainer" containerID="0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657" Mar 09 04:07:58 crc kubenswrapper[4901]: E0309 04:07:58.027711 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657\": container with ID starting with 0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657 not found: ID does not exist" containerID="0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.027754 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657"} err="failed to get container status \"0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657\": rpc error: code = NotFound desc = could not find container \"0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657\": container with ID starting with 0c2c654c9b514af9f2cc94826269d05d6dbacf1d35a9f0232ac0f7fec01d5657 not found: ID does not exist" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.027782 4901 scope.go:117] "RemoveContainer" containerID="bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae" Mar 09 04:07:58 crc kubenswrapper[4901]: E0309 04:07:58.028074 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae\": container with ID starting with bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae not found: ID does not exist" containerID="bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.028108 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae"} err="failed to get container status \"bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae\": rpc error: code = NotFound desc = could not find container \"bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae\": container with ID starting with bed013901aa7b00ec6aacd44b15b0376968b3f604f49d89fb5268602515447ae not found: ID does not exist" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.038254 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 04:07:58 crc kubenswrapper[4901]: E0309 04:07:58.038603 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b9c342-0b0e-486b-bc0c-2abd0319879d" containerName="setup-container" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.038616 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b9c342-0b0e-486b-bc0c-2abd0319879d" containerName="setup-container" Mar 09 04:07:58 crc kubenswrapper[4901]: E0309 04:07:58.038629 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b9c342-0b0e-486b-bc0c-2abd0319879d" containerName="rabbitmq" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.038635 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b9c342-0b0e-486b-bc0c-2abd0319879d" containerName="rabbitmq" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.038772 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b9c342-0b0e-486b-bc0c-2abd0319879d" 
containerName="rabbitmq" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.039653 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.042206 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.042500 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.042574 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.042505 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cptq8" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.042623 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.042748 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.044517 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.062145 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.118502 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274fc5c2-4ab4-47b9-b147-d0a47e22b70e" path="/var/lib/kubelet/pods/274fc5c2-4ab4-47b9-b147-d0a47e22b70e/volumes" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.124729 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a2b9c342-0b0e-486b-bc0c-2abd0319879d" path="/var/lib/kubelet/pods/a2b9c342-0b0e-486b-bc0c-2abd0319879d/volumes" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.125450 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41f7ab2-8b3d-4fac-84b0-6c884673bce9" path="/var/lib/kubelet/pods/c41f7ab2-8b3d-4fac-84b0-6c884673bce9/volumes" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.143890 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50d7049e-19ad-4936-950c-2bcede63c496-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.144205 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.144377 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.144513 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50d7049e-19ad-4936-950c-2bcede63c496-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 
04:07:58.144632 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50d7049e-19ad-4936-950c-2bcede63c496-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.145705 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lz7p\" (UniqueName: \"kubernetes.io/projected/50d7049e-19ad-4936-950c-2bcede63c496-kube-api-access-4lz7p\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.146400 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.146587 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.146767 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50d7049e-19ad-4936-950c-2bcede63c496-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 
crc kubenswrapper[4901]: I0309 04:07:58.148147 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50d7049e-19ad-4936-950c-2bcede63c496-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.148305 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.249689 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50d7049e-19ad-4936-950c-2bcede63c496-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.250099 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50d7049e-19ad-4936-950c-2bcede63c496-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.250153 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 
04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.250209 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50d7049e-19ad-4936-950c-2bcede63c496-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.250266 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.250310 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.250333 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50d7049e-19ad-4936-950c-2bcede63c496-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.250968 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50d7049e-19ad-4936-950c-2bcede63c496-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.251300 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50d7049e-19ad-4936-950c-2bcede63c496-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.251667 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lz7p\" (UniqueName: \"kubernetes.io/projected/50d7049e-19ad-4936-950c-2bcede63c496-kube-api-access-4lz7p\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.251732 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.251777 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.251774 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50d7049e-19ad-4936-950c-2bcede63c496-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.252305 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.252398 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50d7049e-19ad-4936-950c-2bcede63c496-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.252490 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.256159 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.256204 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6bf618cbd06e1420d56660ed6d2a8f0cd7254ca02606d9ff3d5918d03ec1f102/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.256492 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.256949 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50d7049e-19ad-4936-950c-2bcede63c496-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.258589 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50d7049e-19ad-4936-950c-2bcede63c496-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.260606 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50d7049e-19ad-4936-950c-2bcede63c496-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.280970 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lz7p\" (UniqueName: \"kubernetes.io/projected/50d7049e-19ad-4936-950c-2bcede63c496-kube-api-access-4lz7p\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.291496 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 04:07:58 crc kubenswrapper[4901]: W0309 04:07:58.307413 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b05fa0b_5691_466e_9256_812e4809adb2.slice/crio-6a9ca0b44fa00360e6ea2be949ff138fa640c43b7a8cc75736eef40f7d7ba333 WatchSource:0}: Error finding container 6a9ca0b44fa00360e6ea2be949ff138fa640c43b7a8cc75736eef40f7d7ba333: Status 404 returned error can't find the container with id 6a9ca0b44fa00360e6ea2be949ff138fa640c43b7a8cc75736eef40f7d7ba333 Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.307639 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08e0f60e-9de0-4892-8d18-ad06b926d79b\") pod \"rabbitmq-cell1-server-0\" (UID: \"50d7049e-19ad-4936-950c-2bcede63c496\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.358419 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.835922 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.951851 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8b05fa0b-5691-466e-9256-812e4809adb2","Type":"ContainerStarted","Data":"6a9ca0b44fa00360e6ea2be949ff138fa640c43b7a8cc75736eef40f7d7ba333"} Mar 09 04:07:58 crc kubenswrapper[4901]: I0309 04:07:58.954674 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"50d7049e-19ad-4936-950c-2bcede63c496","Type":"ContainerStarted","Data":"40bacd28f082aa0806539527ee6eeb999f2416ad6ac8b022cc01df80c1f33e8c"} Mar 09 04:07:59 crc kubenswrapper[4901]: I0309 04:07:59.968368 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8b05fa0b-5691-466e-9256-812e4809adb2","Type":"ContainerStarted","Data":"8db9998aa0e87805bfd534d9ae27a90801c90d41e5d1d7eaa500832f0610e581"} Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.174923 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550488-j68gd"] Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.176346 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550488-j68gd" Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.179996 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.180127 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.180455 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.194117 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550488-j68gd"] Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.284466 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhzd\" (UniqueName: \"kubernetes.io/projected/b6d8792f-b941-44d3-a0ab-446c2b010bd0-kube-api-access-dwhzd\") pod \"auto-csr-approver-29550488-j68gd\" (UID: \"b6d8792f-b941-44d3-a0ab-446c2b010bd0\") " pod="openshift-infra/auto-csr-approver-29550488-j68gd" Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.386207 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhzd\" (UniqueName: \"kubernetes.io/projected/b6d8792f-b941-44d3-a0ab-446c2b010bd0-kube-api-access-dwhzd\") pod \"auto-csr-approver-29550488-j68gd\" (UID: \"b6d8792f-b941-44d3-a0ab-446c2b010bd0\") " pod="openshift-infra/auto-csr-approver-29550488-j68gd" Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.405113 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhzd\" (UniqueName: \"kubernetes.io/projected/b6d8792f-b941-44d3-a0ab-446c2b010bd0-kube-api-access-dwhzd\") pod \"auto-csr-approver-29550488-j68gd\" (UID: \"b6d8792f-b941-44d3-a0ab-446c2b010bd0\") " 
pod="openshift-infra/auto-csr-approver-29550488-j68gd" Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.510438 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550488-j68gd" Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.983000 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"50d7049e-19ad-4936-950c-2bcede63c496","Type":"ContainerStarted","Data":"cc87b3b9480b52237e3edc8db4bfac738465401d124f0c59495fc289405bc3e2"} Mar 09 04:08:00 crc kubenswrapper[4901]: I0309 04:08:00.985027 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550488-j68gd"] Mar 09 04:08:01 crc kubenswrapper[4901]: W0309 04:08:00.999625 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6d8792f_b941_44d3_a0ab_446c2b010bd0.slice/crio-8ce943db6543fc0e4322c4040aa1b73b94e4c69028157fa781a4c5acbae149d6 WatchSource:0}: Error finding container 8ce943db6543fc0e4322c4040aa1b73b94e4c69028157fa781a4c5acbae149d6: Status 404 returned error can't find the container with id 8ce943db6543fc0e4322c4040aa1b73b94e4c69028157fa781a4c5acbae149d6 Mar 09 04:08:01 crc kubenswrapper[4901]: I0309 04:08:01.995733 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550488-j68gd" event={"ID":"b6d8792f-b941-44d3-a0ab-446c2b010bd0","Type":"ContainerStarted","Data":"8ce943db6543fc0e4322c4040aa1b73b94e4c69028157fa781a4c5acbae149d6"} Mar 09 04:08:03 crc kubenswrapper[4901]: I0309 04:08:03.008833 4901 generic.go:334] "Generic (PLEG): container finished" podID="b6d8792f-b941-44d3-a0ab-446c2b010bd0" containerID="07aca33045e52a42df8f9db2b5908b1ba6fb2b86444f95f48083050b438a88d7" exitCode=0 Mar 09 04:08:03 crc kubenswrapper[4901]: I0309 04:08:03.008953 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29550488-j68gd" event={"ID":"b6d8792f-b941-44d3-a0ab-446c2b010bd0","Type":"ContainerDied","Data":"07aca33045e52a42df8f9db2b5908b1ba6fb2b86444f95f48083050b438a88d7"} Mar 09 04:08:04 crc kubenswrapper[4901]: I0309 04:08:04.425260 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550488-j68gd" Mar 09 04:08:04 crc kubenswrapper[4901]: I0309 04:08:04.553408 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwhzd\" (UniqueName: \"kubernetes.io/projected/b6d8792f-b941-44d3-a0ab-446c2b010bd0-kube-api-access-dwhzd\") pod \"b6d8792f-b941-44d3-a0ab-446c2b010bd0\" (UID: \"b6d8792f-b941-44d3-a0ab-446c2b010bd0\") " Mar 09 04:08:04 crc kubenswrapper[4901]: I0309 04:08:04.561674 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d8792f-b941-44d3-a0ab-446c2b010bd0-kube-api-access-dwhzd" (OuterVolumeSpecName: "kube-api-access-dwhzd") pod "b6d8792f-b941-44d3-a0ab-446c2b010bd0" (UID: "b6d8792f-b941-44d3-a0ab-446c2b010bd0"). InnerVolumeSpecName "kube-api-access-dwhzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:08:04 crc kubenswrapper[4901]: I0309 04:08:04.656188 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwhzd\" (UniqueName: \"kubernetes.io/projected/b6d8792f-b941-44d3-a0ab-446c2b010bd0-kube-api-access-dwhzd\") on node \"crc\" DevicePath \"\"" Mar 09 04:08:05 crc kubenswrapper[4901]: I0309 04:08:05.039931 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550488-j68gd" event={"ID":"b6d8792f-b941-44d3-a0ab-446c2b010bd0","Type":"ContainerDied","Data":"8ce943db6543fc0e4322c4040aa1b73b94e4c69028157fa781a4c5acbae149d6"} Mar 09 04:08:05 crc kubenswrapper[4901]: I0309 04:08:05.039977 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ce943db6543fc0e4322c4040aa1b73b94e4c69028157fa781a4c5acbae149d6" Mar 09 04:08:05 crc kubenswrapper[4901]: I0309 04:08:05.040037 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550488-j68gd" Mar 09 04:08:05 crc kubenswrapper[4901]: I0309 04:08:05.521043 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550482-cvb95"] Mar 09 04:08:05 crc kubenswrapper[4901]: I0309 04:08:05.527702 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550482-cvb95"] Mar 09 04:08:06 crc kubenswrapper[4901]: I0309 04:08:06.136581 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1923e3c5-bc27-4914-9789-a5f731fc2725" path="/var/lib/kubelet/pods/1923e3c5-bc27-4914-9789-a5f731fc2725/volumes" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.358584 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j6l64"] Mar 09 04:08:29 crc kubenswrapper[4901]: E0309 04:08:29.360914 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b6d8792f-b941-44d3-a0ab-446c2b010bd0" containerName="oc" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.360943 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d8792f-b941-44d3-a0ab-446c2b010bd0" containerName="oc" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.361331 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d8792f-b941-44d3-a0ab-446c2b010bd0" containerName="oc" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.363185 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.373410 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6l64"] Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.486917 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-utilities\") pod \"redhat-operators-j6l64\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.487052 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4cbk\" (UniqueName: \"kubernetes.io/projected/720fafbd-95a6-4409-980e-74fd865ef9e9-kube-api-access-n4cbk\") pod \"redhat-operators-j6l64\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.487118 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-catalog-content\") pod \"redhat-operators-j6l64\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " 
pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.589058 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-utilities\") pod \"redhat-operators-j6l64\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.589167 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4cbk\" (UniqueName: \"kubernetes.io/projected/720fafbd-95a6-4409-980e-74fd865ef9e9-kube-api-access-n4cbk\") pod \"redhat-operators-j6l64\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.589239 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-catalog-content\") pod \"redhat-operators-j6l64\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.590077 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-utilities\") pod \"redhat-operators-j6l64\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.590097 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-catalog-content\") pod \"redhat-operators-j6l64\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:29 crc 
kubenswrapper[4901]: I0309 04:08:29.608955 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4cbk\" (UniqueName: \"kubernetes.io/projected/720fafbd-95a6-4409-980e-74fd865ef9e9-kube-api-access-n4cbk\") pod \"redhat-operators-j6l64\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:29 crc kubenswrapper[4901]: I0309 04:08:29.691481 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:30 crc kubenswrapper[4901]: I0309 04:08:30.165577 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6l64"] Mar 09 04:08:30 crc kubenswrapper[4901]: I0309 04:08:30.274428 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6l64" event={"ID":"720fafbd-95a6-4409-980e-74fd865ef9e9","Type":"ContainerStarted","Data":"e147761a3684dcd48a14dcbc73dcb84994093fe9fae615f71b9e68886023bff1"} Mar 09 04:08:31 crc kubenswrapper[4901]: I0309 04:08:31.284839 4901 generic.go:334] "Generic (PLEG): container finished" podID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerID="7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51" exitCode=0 Mar 09 04:08:31 crc kubenswrapper[4901]: I0309 04:08:31.284897 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6l64" event={"ID":"720fafbd-95a6-4409-980e-74fd865ef9e9","Type":"ContainerDied","Data":"7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51"} Mar 09 04:08:32 crc kubenswrapper[4901]: I0309 04:08:32.296888 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6l64" event={"ID":"720fafbd-95a6-4409-980e-74fd865ef9e9","Type":"ContainerStarted","Data":"14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb"} Mar 09 04:08:33 crc kubenswrapper[4901]: I0309 
04:08:33.308621 4901 generic.go:334] "Generic (PLEG): container finished" podID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerID="14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb" exitCode=0 Mar 09 04:08:33 crc kubenswrapper[4901]: I0309 04:08:33.308768 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6l64" event={"ID":"720fafbd-95a6-4409-980e-74fd865ef9e9","Type":"ContainerDied","Data":"14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb"} Mar 09 04:08:33 crc kubenswrapper[4901]: I0309 04:08:33.312103 4901 generic.go:334] "Generic (PLEG): container finished" podID="8b05fa0b-5691-466e-9256-812e4809adb2" containerID="8db9998aa0e87805bfd534d9ae27a90801c90d41e5d1d7eaa500832f0610e581" exitCode=0 Mar 09 04:08:33 crc kubenswrapper[4901]: I0309 04:08:33.312489 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8b05fa0b-5691-466e-9256-812e4809adb2","Type":"ContainerDied","Data":"8db9998aa0e87805bfd534d9ae27a90801c90d41e5d1d7eaa500832f0610e581"} Mar 09 04:08:34 crc kubenswrapper[4901]: I0309 04:08:34.323566 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6l64" event={"ID":"720fafbd-95a6-4409-980e-74fd865ef9e9","Type":"ContainerStarted","Data":"d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f"} Mar 09 04:08:34 crc kubenswrapper[4901]: I0309 04:08:34.326941 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8b05fa0b-5691-466e-9256-812e4809adb2","Type":"ContainerStarted","Data":"3abd200c0ff4a234c24362c42d35f996642cce307e397cb614c8565acacab3ff"} Mar 09 04:08:34 crc kubenswrapper[4901]: I0309 04:08:34.327694 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 04:08:34 crc kubenswrapper[4901]: I0309 04:08:34.329239 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="50d7049e-19ad-4936-950c-2bcede63c496" containerID="cc87b3b9480b52237e3edc8db4bfac738465401d124f0c59495fc289405bc3e2" exitCode=0 Mar 09 04:08:34 crc kubenswrapper[4901]: I0309 04:08:34.329259 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"50d7049e-19ad-4936-950c-2bcede63c496","Type":"ContainerDied","Data":"cc87b3b9480b52237e3edc8db4bfac738465401d124f0c59495fc289405bc3e2"} Mar 09 04:08:34 crc kubenswrapper[4901]: I0309 04:08:34.360644 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j6l64" podStartSLOduration=2.920995654 podStartE2EDuration="5.360622381s" podCreationTimestamp="2026-03-09 04:08:29 +0000 UTC" firstStartedPulling="2026-03-09 04:08:31.287199542 +0000 UTC m=+5235.876863274" lastFinishedPulling="2026-03-09 04:08:33.726826259 +0000 UTC m=+5238.316490001" observedRunningTime="2026-03-09 04:08:34.349691329 +0000 UTC m=+5238.939355071" watchObservedRunningTime="2026-03-09 04:08:34.360622381 +0000 UTC m=+5238.950286123" Mar 09 04:08:34 crc kubenswrapper[4901]: I0309 04:08:34.398134 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.398115213 podStartE2EDuration="37.398115213s" podCreationTimestamp="2026-03-09 04:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:08:34.393855467 +0000 UTC m=+5238.983519209" watchObservedRunningTime="2026-03-09 04:08:34.398115213 +0000 UTC m=+5238.987778945" Mar 09 04:08:35 crc kubenswrapper[4901]: I0309 04:08:35.340156 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"50d7049e-19ad-4936-950c-2bcede63c496","Type":"ContainerStarted","Data":"e5373472aff266f87ccd412f553b610d7b82b58a774bbbe88fad9ad7b584f79e"} Mar 09 04:08:35 crc kubenswrapper[4901]: I0309 
04:08:35.341418 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:08:35 crc kubenswrapper[4901]: I0309 04:08:35.374650 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.374630727 podStartE2EDuration="38.374630727s" podCreationTimestamp="2026-03-09 04:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:08:35.366989527 +0000 UTC m=+5239.956653259" watchObservedRunningTime="2026-03-09 04:08:35.374630727 +0000 UTC m=+5239.964294469" Mar 09 04:08:39 crc kubenswrapper[4901]: I0309 04:08:39.692668 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:39 crc kubenswrapper[4901]: I0309 04:08:39.694216 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:40 crc kubenswrapper[4901]: I0309 04:08:40.765821 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j6l64" podUID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerName="registry-server" probeResult="failure" output=< Mar 09 04:08:40 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 04:08:40 crc kubenswrapper[4901]: > Mar 09 04:08:41 crc kubenswrapper[4901]: I0309 04:08:41.933091 4901 scope.go:117] "RemoveContainer" containerID="36b1b227bfe6f1a73a7a1cb4120a1829a8c4b32d60c17b57e40d1201f9ca2bd4" Mar 09 04:08:47 crc kubenswrapper[4901]: I0309 04:08:47.702520 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 04:08:48 crc kubenswrapper[4901]: I0309 04:08:48.362549 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Mar 09 04:08:49 crc kubenswrapper[4901]: I0309 04:08:49.773831 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:49 crc kubenswrapper[4901]: I0309 04:08:49.818594 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:50 crc kubenswrapper[4901]: I0309 04:08:50.028383 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6l64"] Mar 09 04:08:51 crc kubenswrapper[4901]: I0309 04:08:51.471701 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j6l64" podUID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerName="registry-server" containerID="cri-o://d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f" gracePeriod=2 Mar 09 04:08:51 crc kubenswrapper[4901]: I0309 04:08:51.939925 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.050102 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4cbk\" (UniqueName: \"kubernetes.io/projected/720fafbd-95a6-4409-980e-74fd865ef9e9-kube-api-access-n4cbk\") pod \"720fafbd-95a6-4409-980e-74fd865ef9e9\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.050303 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-utilities\") pod \"720fafbd-95a6-4409-980e-74fd865ef9e9\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.050620 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-catalog-content\") pod \"720fafbd-95a6-4409-980e-74fd865ef9e9\" (UID: \"720fafbd-95a6-4409-980e-74fd865ef9e9\") " Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.052816 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-utilities" (OuterVolumeSpecName: "utilities") pod "720fafbd-95a6-4409-980e-74fd865ef9e9" (UID: "720fafbd-95a6-4409-980e-74fd865ef9e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.066032 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720fafbd-95a6-4409-980e-74fd865ef9e9-kube-api-access-n4cbk" (OuterVolumeSpecName: "kube-api-access-n4cbk") pod "720fafbd-95a6-4409-980e-74fd865ef9e9" (UID: "720fafbd-95a6-4409-980e-74fd865ef9e9"). InnerVolumeSpecName "kube-api-access-n4cbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.156978 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.157029 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4cbk\" (UniqueName: \"kubernetes.io/projected/720fafbd-95a6-4409-980e-74fd865ef9e9-kube-api-access-n4cbk\") on node \"crc\" DevicePath \"\"" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.209289 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "720fafbd-95a6-4409-980e-74fd865ef9e9" (UID: "720fafbd-95a6-4409-980e-74fd865ef9e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.259284 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720fafbd-95a6-4409-980e-74fd865ef9e9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.402842 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 09 04:08:52 crc kubenswrapper[4901]: E0309 04:08:52.403816 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerName="extract-utilities" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.403850 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerName="extract-utilities" Mar 09 04:08:52 crc kubenswrapper[4901]: E0309 04:08:52.403944 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerName="registry-server" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.403961 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerName="registry-server" Mar 09 04:08:52 crc kubenswrapper[4901]: E0309 04:08:52.404042 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerName="extract-content" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.404058 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerName="extract-content" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.404559 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerName="registry-server" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.406101 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.411119 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-58bdh" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.413002 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.485821 4901 generic.go:334] "Generic (PLEG): container finished" podID="720fafbd-95a6-4409-980e-74fd865ef9e9" containerID="d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f" exitCode=0 Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.485864 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6l64" event={"ID":"720fafbd-95a6-4409-980e-74fd865ef9e9","Type":"ContainerDied","Data":"d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f"} Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.485889 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6l64" event={"ID":"720fafbd-95a6-4409-980e-74fd865ef9e9","Type":"ContainerDied","Data":"e147761a3684dcd48a14dcbc73dcb84994093fe9fae615f71b9e68886023bff1"} Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.485910 4901 scope.go:117] "RemoveContainer" containerID="d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.485926 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6l64" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.515935 4901 scope.go:117] "RemoveContainer" containerID="14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.526516 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6l64"] Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.535141 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j6l64"] Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.544033 4901 scope.go:117] "RemoveContainer" containerID="7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.563934 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5lm5\" (UniqueName: \"kubernetes.io/projected/bc89061d-5a62-4c39-bae7-b8de5221d783-kube-api-access-c5lm5\") pod \"mariadb-client\" (UID: \"bc89061d-5a62-4c39-bae7-b8de5221d783\") " pod="openstack/mariadb-client" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.571706 4901 scope.go:117] "RemoveContainer" containerID="d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f" Mar 09 04:08:52 crc kubenswrapper[4901]: E0309 04:08:52.572155 4901 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f\": container with ID starting with d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f not found: ID does not exist" containerID="d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.572191 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f"} err="failed to get container status \"d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f\": rpc error: code = NotFound desc = could not find container \"d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f\": container with ID starting with d427c218959224d421e2a2e045ea3e0ee7ad78d1d53dea34639ffef7ca95ec2f not found: ID does not exist" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.572233 4901 scope.go:117] "RemoveContainer" containerID="14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb" Mar 09 04:08:52 crc kubenswrapper[4901]: E0309 04:08:52.572802 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb\": container with ID starting with 14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb not found: ID does not exist" containerID="14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.572824 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb"} err="failed to get container status \"14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb\": rpc error: code = NotFound desc = could not find container 
\"14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb\": container with ID starting with 14871fcb84dbdd6c0c42739159219bcdd7955897744e5bc40a7eb2e6ea8087bb not found: ID does not exist" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.572838 4901 scope.go:117] "RemoveContainer" containerID="7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51" Mar 09 04:08:52 crc kubenswrapper[4901]: E0309 04:08:52.573168 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51\": container with ID starting with 7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51 not found: ID does not exist" containerID="7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.573274 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51"} err="failed to get container status \"7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51\": rpc error: code = NotFound desc = could not find container \"7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51\": container with ID starting with 7ecd12eb8b1e590d7ee993759298b6185507571822220a9b96d0a35c25afba51 not found: ID does not exist" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.666272 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5lm5\" (UniqueName: \"kubernetes.io/projected/bc89061d-5a62-4c39-bae7-b8de5221d783-kube-api-access-c5lm5\") pod \"mariadb-client\" (UID: \"bc89061d-5a62-4c39-bae7-b8de5221d783\") " pod="openstack/mariadb-client" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.688041 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5lm5\" (UniqueName: 
\"kubernetes.io/projected/bc89061d-5a62-4c39-bae7-b8de5221d783-kube-api-access-c5lm5\") pod \"mariadb-client\" (UID: \"bc89061d-5a62-4c39-bae7-b8de5221d783\") " pod="openstack/mariadb-client" Mar 09 04:08:52 crc kubenswrapper[4901]: I0309 04:08:52.732256 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:08:53 crc kubenswrapper[4901]: I0309 04:08:53.087858 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:08:53 crc kubenswrapper[4901]: W0309 04:08:53.088448 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc89061d_5a62_4c39_bae7_b8de5221d783.slice/crio-a8f14d5eed1093fca61a10baebae1e1f18035a7130bb6beb13b42938c94b2cca WatchSource:0}: Error finding container a8f14d5eed1093fca61a10baebae1e1f18035a7130bb6beb13b42938c94b2cca: Status 404 returned error can't find the container with id a8f14d5eed1093fca61a10baebae1e1f18035a7130bb6beb13b42938c94b2cca Mar 09 04:08:53 crc kubenswrapper[4901]: I0309 04:08:53.496817 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bc89061d-5a62-4c39-bae7-b8de5221d783","Type":"ContainerStarted","Data":"a8f14d5eed1093fca61a10baebae1e1f18035a7130bb6beb13b42938c94b2cca"} Mar 09 04:08:54 crc kubenswrapper[4901]: I0309 04:08:54.114022 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720fafbd-95a6-4409-980e-74fd865ef9e9" path="/var/lib/kubelet/pods/720fafbd-95a6-4409-980e-74fd865ef9e9/volumes" Mar 09 04:08:54 crc kubenswrapper[4901]: I0309 04:08:54.507329 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bc89061d-5a62-4c39-bae7-b8de5221d783","Type":"ContainerStarted","Data":"97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330"} Mar 09 04:08:54 crc kubenswrapper[4901]: I0309 04:08:54.528828 4901 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.025665154 podStartE2EDuration="2.528799166s" podCreationTimestamp="2026-03-09 04:08:52 +0000 UTC" firstStartedPulling="2026-03-09 04:08:53.090911349 +0000 UTC m=+5257.680575081" lastFinishedPulling="2026-03-09 04:08:53.59404532 +0000 UTC m=+5258.183709093" observedRunningTime="2026-03-09 04:08:54.526687003 +0000 UTC m=+5259.116350785" watchObservedRunningTime="2026-03-09 04:08:54.528799166 +0000 UTC m=+5259.118462938" Mar 09 04:09:09 crc kubenswrapper[4901]: I0309 04:09:09.617491 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:09:09 crc kubenswrapper[4901]: I0309 04:09:09.618352 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="bc89061d-5a62-4c39-bae7-b8de5221d783" containerName="mariadb-client" containerID="cri-o://97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330" gracePeriod=30 Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.193341 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.295214 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5lm5\" (UniqueName: \"kubernetes.io/projected/bc89061d-5a62-4c39-bae7-b8de5221d783-kube-api-access-c5lm5\") pod \"bc89061d-5a62-4c39-bae7-b8de5221d783\" (UID: \"bc89061d-5a62-4c39-bae7-b8de5221d783\") " Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.304709 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc89061d-5a62-4c39-bae7-b8de5221d783-kube-api-access-c5lm5" (OuterVolumeSpecName: "kube-api-access-c5lm5") pod "bc89061d-5a62-4c39-bae7-b8de5221d783" (UID: "bc89061d-5a62-4c39-bae7-b8de5221d783"). InnerVolumeSpecName "kube-api-access-c5lm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.398462 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5lm5\" (UniqueName: \"kubernetes.io/projected/bc89061d-5a62-4c39-bae7-b8de5221d783-kube-api-access-c5lm5\") on node \"crc\" DevicePath \"\"" Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.685613 4901 generic.go:334] "Generic (PLEG): container finished" podID="bc89061d-5a62-4c39-bae7-b8de5221d783" containerID="97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330" exitCode=143 Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.685654 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.685693 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bc89061d-5a62-4c39-bae7-b8de5221d783","Type":"ContainerDied","Data":"97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330"} Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.685799 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bc89061d-5a62-4c39-bae7-b8de5221d783","Type":"ContainerDied","Data":"a8f14d5eed1093fca61a10baebae1e1f18035a7130bb6beb13b42938c94b2cca"} Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.685844 4901 scope.go:117] "RemoveContainer" containerID="97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330" Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.722398 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.722678 4901 scope.go:117] "RemoveContainer" containerID="97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330" Mar 09 04:09:10 crc kubenswrapper[4901]: E0309 04:09:10.723144 4901 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330\": container with ID starting with 97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330 not found: ID does not exist" containerID="97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330" Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.723190 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330"} err="failed to get container status \"97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330\": rpc error: code = NotFound desc = could not find container \"97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330\": container with ID starting with 97be7fc87c6d0b5b1626fe5c85423c84c6b1248acf80b6888bc3860d20084330 not found: ID does not exist" Mar 09 04:09:10 crc kubenswrapper[4901]: I0309 04:09:10.729099 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:09:12 crc kubenswrapper[4901]: I0309 04:09:12.128566 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc89061d-5a62-4c39-bae7-b8de5221d783" path="/var/lib/kubelet/pods/bc89061d-5a62-4c39-bae7-b8de5221d783/volumes" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.167754 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550490-5xxvw"] Mar 09 04:10:00 crc kubenswrapper[4901]: E0309 04:10:00.168667 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc89061d-5a62-4c39-bae7-b8de5221d783" containerName="mariadb-client" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.168683 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc89061d-5a62-4c39-bae7-b8de5221d783" containerName="mariadb-client" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.168902 4901 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bc89061d-5a62-4c39-bae7-b8de5221d783" containerName="mariadb-client" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.169536 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550490-5xxvw" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.176198 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.176351 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.176556 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.177428 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550490-5xxvw"] Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.225247 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmjmq\" (UniqueName: \"kubernetes.io/projected/9cde0fc6-6b28-4065-b35e-3a9dda22574a-kube-api-access-kmjmq\") pod \"auto-csr-approver-29550490-5xxvw\" (UID: \"9cde0fc6-6b28-4065-b35e-3a9dda22574a\") " pod="openshift-infra/auto-csr-approver-29550490-5xxvw" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.326751 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmjmq\" (UniqueName: \"kubernetes.io/projected/9cde0fc6-6b28-4065-b35e-3a9dda22574a-kube-api-access-kmjmq\") pod \"auto-csr-approver-29550490-5xxvw\" (UID: \"9cde0fc6-6b28-4065-b35e-3a9dda22574a\") " pod="openshift-infra/auto-csr-approver-29550490-5xxvw" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.353531 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kmjmq\" (UniqueName: \"kubernetes.io/projected/9cde0fc6-6b28-4065-b35e-3a9dda22574a-kube-api-access-kmjmq\") pod \"auto-csr-approver-29550490-5xxvw\" (UID: \"9cde0fc6-6b28-4065-b35e-3a9dda22574a\") " pod="openshift-infra/auto-csr-approver-29550490-5xxvw" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.517669 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550490-5xxvw" Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.863199 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:10:00 crc kubenswrapper[4901]: I0309 04:10:00.863604 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:10:01 crc kubenswrapper[4901]: I0309 04:10:01.023636 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550490-5xxvw"] Mar 09 04:10:01 crc kubenswrapper[4901]: W0309 04:10:01.032211 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cde0fc6_6b28_4065_b35e_3a9dda22574a.slice/crio-9ba98fdf17ec7b311d63cecf14e7a519beebe2aadc849b4d627036f7ba6b6729 WatchSource:0}: Error finding container 9ba98fdf17ec7b311d63cecf14e7a519beebe2aadc849b4d627036f7ba6b6729: Status 404 returned error can't find the container with id 9ba98fdf17ec7b311d63cecf14e7a519beebe2aadc849b4d627036f7ba6b6729 Mar 09 04:10:01 crc kubenswrapper[4901]: I0309 04:10:01.205339 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550490-5xxvw" event={"ID":"9cde0fc6-6b28-4065-b35e-3a9dda22574a","Type":"ContainerStarted","Data":"9ba98fdf17ec7b311d63cecf14e7a519beebe2aadc849b4d627036f7ba6b6729"} Mar 09 04:10:03 crc kubenswrapper[4901]: I0309 04:10:03.227821 4901 generic.go:334] "Generic (PLEG): container finished" podID="9cde0fc6-6b28-4065-b35e-3a9dda22574a" containerID="4bf87b2f9423cd63169f60194fb521e0ae02ffb275c0cc3ee1ee878233a5ce4f" exitCode=0 Mar 09 04:10:03 crc kubenswrapper[4901]: I0309 04:10:03.228458 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550490-5xxvw" event={"ID":"9cde0fc6-6b28-4065-b35e-3a9dda22574a","Type":"ContainerDied","Data":"4bf87b2f9423cd63169f60194fb521e0ae02ffb275c0cc3ee1ee878233a5ce4f"} Mar 09 04:10:04 crc kubenswrapper[4901]: I0309 04:10:04.572338 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550490-5xxvw" Mar 09 04:10:04 crc kubenswrapper[4901]: I0309 04:10:04.695000 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmjmq\" (UniqueName: \"kubernetes.io/projected/9cde0fc6-6b28-4065-b35e-3a9dda22574a-kube-api-access-kmjmq\") pod \"9cde0fc6-6b28-4065-b35e-3a9dda22574a\" (UID: \"9cde0fc6-6b28-4065-b35e-3a9dda22574a\") " Mar 09 04:10:04 crc kubenswrapper[4901]: I0309 04:10:04.704500 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cde0fc6-6b28-4065-b35e-3a9dda22574a-kube-api-access-kmjmq" (OuterVolumeSpecName: "kube-api-access-kmjmq") pod "9cde0fc6-6b28-4065-b35e-3a9dda22574a" (UID: "9cde0fc6-6b28-4065-b35e-3a9dda22574a"). InnerVolumeSpecName "kube-api-access-kmjmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:10:04 crc kubenswrapper[4901]: I0309 04:10:04.796894 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmjmq\" (UniqueName: \"kubernetes.io/projected/9cde0fc6-6b28-4065-b35e-3a9dda22574a-kube-api-access-kmjmq\") on node \"crc\" DevicePath \"\"" Mar 09 04:10:05 crc kubenswrapper[4901]: I0309 04:10:05.253594 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550490-5xxvw" event={"ID":"9cde0fc6-6b28-4065-b35e-3a9dda22574a","Type":"ContainerDied","Data":"9ba98fdf17ec7b311d63cecf14e7a519beebe2aadc849b4d627036f7ba6b6729"} Mar 09 04:10:05 crc kubenswrapper[4901]: I0309 04:10:05.254206 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ba98fdf17ec7b311d63cecf14e7a519beebe2aadc849b4d627036f7ba6b6729" Mar 09 04:10:05 crc kubenswrapper[4901]: I0309 04:10:05.253652 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550490-5xxvw" Mar 09 04:10:05 crc kubenswrapper[4901]: I0309 04:10:05.677898 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550484-thrv5"] Mar 09 04:10:05 crc kubenswrapper[4901]: I0309 04:10:05.686460 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550484-thrv5"] Mar 09 04:10:06 crc kubenswrapper[4901]: I0309 04:10:06.122313 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="045dd9d2-24bb-4198-ad1f-1dfdfe81afb5" path="/var/lib/kubelet/pods/045dd9d2-24bb-4198-ad1f-1dfdfe81afb5/volumes" Mar 09 04:10:30 crc kubenswrapper[4901]: I0309 04:10:30.863336 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 09 04:10:30 crc kubenswrapper[4901]: I0309 04:10:30.864160 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:10:42 crc kubenswrapper[4901]: I0309 04:10:42.141793 4901 scope.go:117] "RemoveContainer" containerID="2e66386be2b1f3b38ff34fb6f84d874f024b7ba9a4e63a7ca9df1d17e4294b4d" Mar 09 04:10:42 crc kubenswrapper[4901]: I0309 04:10:42.180469 4901 scope.go:117] "RemoveContainer" containerID="8620404e3532a82760258afe7ee9479528ae51c4ce90cbcc60db4c8ce16cd758" Mar 09 04:11:00 crc kubenswrapper[4901]: I0309 04:11:00.862863 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:11:00 crc kubenswrapper[4901]: I0309 04:11:00.863668 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:11:00 crc kubenswrapper[4901]: I0309 04:11:00.863741 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 04:11:00 crc kubenswrapper[4901]: I0309 04:11:00.864781 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 04:11:00 crc kubenswrapper[4901]: I0309 04:11:00.864887 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" gracePeriod=600 Mar 09 04:11:01 crc kubenswrapper[4901]: E0309 04:11:01.009137 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:11:01 crc kubenswrapper[4901]: I0309 04:11:01.850840 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" exitCode=0 Mar 09 04:11:01 crc kubenswrapper[4901]: I0309 04:11:01.850914 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459"} Mar 09 04:11:01 crc kubenswrapper[4901]: I0309 04:11:01.851205 4901 scope.go:117] "RemoveContainer" containerID="ee11181bba501a52807c1c6a38036b291b104f5a882250fea8dbae3a89c2ab93" Mar 09 04:11:01 crc kubenswrapper[4901]: I0309 04:11:01.851813 4901 
scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:11:01 crc kubenswrapper[4901]: E0309 04:11:01.852085 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:11:15 crc kubenswrapper[4901]: I0309 04:11:15.106884 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:11:15 crc kubenswrapper[4901]: E0309 04:11:15.107574 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:11:29 crc kubenswrapper[4901]: I0309 04:11:29.107648 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:11:29 crc kubenswrapper[4901]: E0309 04:11:29.108731 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:11:44 crc kubenswrapper[4901]: I0309 
04:11:44.107542 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:11:44 crc kubenswrapper[4901]: E0309 04:11:44.108717 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:11:57 crc kubenswrapper[4901]: I0309 04:11:57.106792 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:11:57 crc kubenswrapper[4901]: E0309 04:11:57.107526 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.166881 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550492-bxlcq"] Mar 09 04:12:00 crc kubenswrapper[4901]: E0309 04:12:00.167803 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cde0fc6-6b28-4065-b35e-3a9dda22574a" containerName="oc" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.167823 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cde0fc6-6b28-4065-b35e-3a9dda22574a" containerName="oc" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.168030 4901 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9cde0fc6-6b28-4065-b35e-3a9dda22574a" containerName="oc" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.168676 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550492-bxlcq" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.171625 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.171683 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.173279 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.178862 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550492-bxlcq"] Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.262371 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stj4p\" (UniqueName: \"kubernetes.io/projected/4a165250-7b91-481f-80f9-1561a790b7c9-kube-api-access-stj4p\") pod \"auto-csr-approver-29550492-bxlcq\" (UID: \"4a165250-7b91-481f-80f9-1561a790b7c9\") " pod="openshift-infra/auto-csr-approver-29550492-bxlcq" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.364473 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stj4p\" (UniqueName: \"kubernetes.io/projected/4a165250-7b91-481f-80f9-1561a790b7c9-kube-api-access-stj4p\") pod \"auto-csr-approver-29550492-bxlcq\" (UID: \"4a165250-7b91-481f-80f9-1561a790b7c9\") " pod="openshift-infra/auto-csr-approver-29550492-bxlcq" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.395000 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stj4p\" (UniqueName: 
\"kubernetes.io/projected/4a165250-7b91-481f-80f9-1561a790b7c9-kube-api-access-stj4p\") pod \"auto-csr-approver-29550492-bxlcq\" (UID: \"4a165250-7b91-481f-80f9-1561a790b7c9\") " pod="openshift-infra/auto-csr-approver-29550492-bxlcq" Mar 09 04:12:00 crc kubenswrapper[4901]: I0309 04:12:00.502298 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550492-bxlcq" Mar 09 04:12:01 crc kubenswrapper[4901]: I0309 04:12:01.026841 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550492-bxlcq"] Mar 09 04:12:01 crc kubenswrapper[4901]: I0309 04:12:01.044461 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 04:12:01 crc kubenswrapper[4901]: I0309 04:12:01.426004 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550492-bxlcq" event={"ID":"4a165250-7b91-481f-80f9-1561a790b7c9","Type":"ContainerStarted","Data":"ed0999a486ead3822087c1c4bb7047219b12367fdb94573c9f3aa220bd56b103"} Mar 09 04:12:02 crc kubenswrapper[4901]: I0309 04:12:02.435994 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550492-bxlcq" event={"ID":"4a165250-7b91-481f-80f9-1561a790b7c9","Type":"ContainerStarted","Data":"7fb4cff40f2d552f73408e05cbd3fd49b9c40585ea6a0fa507d15e7ef5bc3a16"} Mar 09 04:12:02 crc kubenswrapper[4901]: I0309 04:12:02.450193 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550492-bxlcq" podStartSLOduration=1.5788059859999999 podStartE2EDuration="2.450166065s" podCreationTimestamp="2026-03-09 04:12:00 +0000 UTC" firstStartedPulling="2026-03-09 04:12:01.043996546 +0000 UTC m=+5445.633660318" lastFinishedPulling="2026-03-09 04:12:01.915356625 +0000 UTC m=+5446.505020397" observedRunningTime="2026-03-09 04:12:02.448879823 +0000 UTC m=+5447.038543565" 
watchObservedRunningTime="2026-03-09 04:12:02.450166065 +0000 UTC m=+5447.039829807" Mar 09 04:12:03 crc kubenswrapper[4901]: I0309 04:12:03.461282 4901 generic.go:334] "Generic (PLEG): container finished" podID="4a165250-7b91-481f-80f9-1561a790b7c9" containerID="7fb4cff40f2d552f73408e05cbd3fd49b9c40585ea6a0fa507d15e7ef5bc3a16" exitCode=0 Mar 09 04:12:03 crc kubenswrapper[4901]: I0309 04:12:03.461331 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550492-bxlcq" event={"ID":"4a165250-7b91-481f-80f9-1561a790b7c9","Type":"ContainerDied","Data":"7fb4cff40f2d552f73408e05cbd3fd49b9c40585ea6a0fa507d15e7ef5bc3a16"} Mar 09 04:12:04 crc kubenswrapper[4901]: I0309 04:12:04.881024 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550492-bxlcq" Mar 09 04:12:04 crc kubenswrapper[4901]: I0309 04:12:04.941638 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stj4p\" (UniqueName: \"kubernetes.io/projected/4a165250-7b91-481f-80f9-1561a790b7c9-kube-api-access-stj4p\") pod \"4a165250-7b91-481f-80f9-1561a790b7c9\" (UID: \"4a165250-7b91-481f-80f9-1561a790b7c9\") " Mar 09 04:12:04 crc kubenswrapper[4901]: I0309 04:12:04.957202 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a165250-7b91-481f-80f9-1561a790b7c9-kube-api-access-stj4p" (OuterVolumeSpecName: "kube-api-access-stj4p") pod "4a165250-7b91-481f-80f9-1561a790b7c9" (UID: "4a165250-7b91-481f-80f9-1561a790b7c9"). InnerVolumeSpecName "kube-api-access-stj4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:12:05 crc kubenswrapper[4901]: I0309 04:12:05.043778 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stj4p\" (UniqueName: \"kubernetes.io/projected/4a165250-7b91-481f-80f9-1561a790b7c9-kube-api-access-stj4p\") on node \"crc\" DevicePath \"\"" Mar 09 04:12:05 crc kubenswrapper[4901]: I0309 04:12:05.494825 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550492-bxlcq" event={"ID":"4a165250-7b91-481f-80f9-1561a790b7c9","Type":"ContainerDied","Data":"ed0999a486ead3822087c1c4bb7047219b12367fdb94573c9f3aa220bd56b103"} Mar 09 04:12:05 crc kubenswrapper[4901]: I0309 04:12:05.495198 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550492-bxlcq" Mar 09 04:12:05 crc kubenswrapper[4901]: I0309 04:12:05.495205 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed0999a486ead3822087c1c4bb7047219b12367fdb94573c9f3aa220bd56b103" Mar 09 04:12:05 crc kubenswrapper[4901]: I0309 04:12:05.566067 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550486-sp2hq"] Mar 09 04:12:05 crc kubenswrapper[4901]: I0309 04:12:05.572642 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550486-sp2hq"] Mar 09 04:12:06 crc kubenswrapper[4901]: I0309 04:12:06.131782 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f69fd42-fdc0-4311-bc09-3a1307f04e40" path="/var/lib/kubelet/pods/1f69fd42-fdc0-4311-bc09-3a1307f04e40/volumes" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.356457 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m54xk"] Mar 09 04:12:08 crc kubenswrapper[4901]: E0309 04:12:08.357113 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4a165250-7b91-481f-80f9-1561a790b7c9" containerName="oc" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.357133 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a165250-7b91-481f-80f9-1561a790b7c9" containerName="oc" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.357493 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a165250-7b91-481f-80f9-1561a790b7c9" containerName="oc" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.364145 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.375969 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m54xk"] Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.499787 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5sgr\" (UniqueName: \"kubernetes.io/projected/304b98fc-4d85-4041-ac04-aac01afd2005-kube-api-access-d5sgr\") pod \"certified-operators-m54xk\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.499867 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-utilities\") pod \"certified-operators-m54xk\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.499942 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-catalog-content\") pod \"certified-operators-m54xk\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " 
pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.601789 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-catalog-content\") pod \"certified-operators-m54xk\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.602170 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5sgr\" (UniqueName: \"kubernetes.io/projected/304b98fc-4d85-4041-ac04-aac01afd2005-kube-api-access-d5sgr\") pod \"certified-operators-m54xk\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.602204 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-utilities\") pod \"certified-operators-m54xk\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.602619 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-catalog-content\") pod \"certified-operators-m54xk\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.602711 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-utilities\") pod \"certified-operators-m54xk\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " 
pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.628642 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5sgr\" (UniqueName: \"kubernetes.io/projected/304b98fc-4d85-4041-ac04-aac01afd2005-kube-api-access-d5sgr\") pod \"certified-operators-m54xk\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:08 crc kubenswrapper[4901]: I0309 04:12:08.709564 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:09 crc kubenswrapper[4901]: I0309 04:12:09.199180 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m54xk"] Mar 09 04:12:09 crc kubenswrapper[4901]: I0309 04:12:09.530206 4901 generic.go:334] "Generic (PLEG): container finished" podID="304b98fc-4d85-4041-ac04-aac01afd2005" containerID="73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149" exitCode=0 Mar 09 04:12:09 crc kubenswrapper[4901]: I0309 04:12:09.530255 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m54xk" event={"ID":"304b98fc-4d85-4041-ac04-aac01afd2005","Type":"ContainerDied","Data":"73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149"} Mar 09 04:12:09 crc kubenswrapper[4901]: I0309 04:12:09.530278 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m54xk" event={"ID":"304b98fc-4d85-4041-ac04-aac01afd2005","Type":"ContainerStarted","Data":"c113928dc542e71e007ffc69ba88f03246d75413a873dfe4b9848ed32c56aa02"} Mar 09 04:12:10 crc kubenswrapper[4901]: I0309 04:12:10.541441 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m54xk" 
event={"ID":"304b98fc-4d85-4041-ac04-aac01afd2005","Type":"ContainerStarted","Data":"84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335"} Mar 09 04:12:11 crc kubenswrapper[4901]: I0309 04:12:11.557601 4901 generic.go:334] "Generic (PLEG): container finished" podID="304b98fc-4d85-4041-ac04-aac01afd2005" containerID="84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335" exitCode=0 Mar 09 04:12:11 crc kubenswrapper[4901]: I0309 04:12:11.557728 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m54xk" event={"ID":"304b98fc-4d85-4041-ac04-aac01afd2005","Type":"ContainerDied","Data":"84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335"} Mar 09 04:12:12 crc kubenswrapper[4901]: I0309 04:12:12.107092 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:12:12 crc kubenswrapper[4901]: E0309 04:12:12.107798 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:12:12 crc kubenswrapper[4901]: I0309 04:12:12.569272 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m54xk" event={"ID":"304b98fc-4d85-4041-ac04-aac01afd2005","Type":"ContainerStarted","Data":"ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7"} Mar 09 04:12:18 crc kubenswrapper[4901]: I0309 04:12:18.710031 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:18 crc kubenswrapper[4901]: I0309 04:12:18.710623 4901 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:18 crc kubenswrapper[4901]: I0309 04:12:18.776121 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:18 crc kubenswrapper[4901]: I0309 04:12:18.801301 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m54xk" podStartSLOduration=8.349366146 podStartE2EDuration="10.801281249s" podCreationTimestamp="2026-03-09 04:12:08 +0000 UTC" firstStartedPulling="2026-03-09 04:12:09.531534551 +0000 UTC m=+5454.121198283" lastFinishedPulling="2026-03-09 04:12:11.983449654 +0000 UTC m=+5456.573113386" observedRunningTime="2026-03-09 04:12:12.587616019 +0000 UTC m=+5457.177279751" watchObservedRunningTime="2026-03-09 04:12:18.801281249 +0000 UTC m=+5463.390944991" Mar 09 04:12:19 crc kubenswrapper[4901]: I0309 04:12:19.707555 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:19 crc kubenswrapper[4901]: I0309 04:12:19.785924 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m54xk"] Mar 09 04:12:21 crc kubenswrapper[4901]: I0309 04:12:21.660070 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m54xk" podUID="304b98fc-4d85-4041-ac04-aac01afd2005" containerName="registry-server" containerID="cri-o://ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7" gracePeriod=2 Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.200547 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.343059 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5sgr\" (UniqueName: \"kubernetes.io/projected/304b98fc-4d85-4041-ac04-aac01afd2005-kube-api-access-d5sgr\") pod \"304b98fc-4d85-4041-ac04-aac01afd2005\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.343268 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-catalog-content\") pod \"304b98fc-4d85-4041-ac04-aac01afd2005\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.343321 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-utilities\") pod \"304b98fc-4d85-4041-ac04-aac01afd2005\" (UID: \"304b98fc-4d85-4041-ac04-aac01afd2005\") " Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.345641 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-utilities" (OuterVolumeSpecName: "utilities") pod "304b98fc-4d85-4041-ac04-aac01afd2005" (UID: "304b98fc-4d85-4041-ac04-aac01afd2005"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.359677 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304b98fc-4d85-4041-ac04-aac01afd2005-kube-api-access-d5sgr" (OuterVolumeSpecName: "kube-api-access-d5sgr") pod "304b98fc-4d85-4041-ac04-aac01afd2005" (UID: "304b98fc-4d85-4041-ac04-aac01afd2005"). InnerVolumeSpecName "kube-api-access-d5sgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.407396 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "304b98fc-4d85-4041-ac04-aac01afd2005" (UID: "304b98fc-4d85-4041-ac04-aac01afd2005"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.444963 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.444997 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/304b98fc-4d85-4041-ac04-aac01afd2005-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.445007 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5sgr\" (UniqueName: \"kubernetes.io/projected/304b98fc-4d85-4041-ac04-aac01afd2005-kube-api-access-d5sgr\") on node \"crc\" DevicePath \"\"" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.674655 4901 generic.go:334] "Generic (PLEG): container finished" podID="304b98fc-4d85-4041-ac04-aac01afd2005" containerID="ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7" exitCode=0 Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.674710 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m54xk" event={"ID":"304b98fc-4d85-4041-ac04-aac01afd2005","Type":"ContainerDied","Data":"ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7"} Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.674746 4901 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-m54xk" event={"ID":"304b98fc-4d85-4041-ac04-aac01afd2005","Type":"ContainerDied","Data":"c113928dc542e71e007ffc69ba88f03246d75413a873dfe4b9848ed32c56aa02"} Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.674759 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m54xk" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.674772 4901 scope.go:117] "RemoveContainer" containerID="ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.727025 4901 scope.go:117] "RemoveContainer" containerID="84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.748510 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m54xk"] Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.765497 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m54xk"] Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.777596 4901 scope.go:117] "RemoveContainer" containerID="73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.812488 4901 scope.go:117] "RemoveContainer" containerID="ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7" Mar 09 04:12:22 crc kubenswrapper[4901]: E0309 04:12:22.813066 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7\": container with ID starting with ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7 not found: ID does not exist" containerID="ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 
04:12:22.813382 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7"} err="failed to get container status \"ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7\": rpc error: code = NotFound desc = could not find container \"ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7\": container with ID starting with ebc4e77be23990933b7911b64eb9514206e45e9457217a765d7d1d979124ecc7 not found: ID does not exist" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.813587 4901 scope.go:117] "RemoveContainer" containerID="84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335" Mar 09 04:12:22 crc kubenswrapper[4901]: E0309 04:12:22.814214 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335\": container with ID starting with 84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335 not found: ID does not exist" containerID="84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.814272 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335"} err="failed to get container status \"84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335\": rpc error: code = NotFound desc = could not find container \"84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335\": container with ID starting with 84daf292fee9ebd1f2903a3657a66558c36bc0be0aea6bb368da4d6cae7bf335 not found: ID does not exist" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.814301 4901 scope.go:117] "RemoveContainer" containerID="73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149" Mar 09 04:12:22 crc 
kubenswrapper[4901]: E0309 04:12:22.814835 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149\": container with ID starting with 73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149 not found: ID does not exist" containerID="73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149" Mar 09 04:12:22 crc kubenswrapper[4901]: I0309 04:12:22.815070 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149"} err="failed to get container status \"73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149\": rpc error: code = NotFound desc = could not find container \"73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149\": container with ID starting with 73e813de6b788b72b9ffb273e921e9c49f6ee7e44718227ec1ac0a32ccb98149 not found: ID does not exist" Mar 09 04:12:24 crc kubenswrapper[4901]: I0309 04:12:24.124607 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304b98fc-4d85-4041-ac04-aac01afd2005" path="/var/lib/kubelet/pods/304b98fc-4d85-4041-ac04-aac01afd2005/volumes" Mar 09 04:12:27 crc kubenswrapper[4901]: I0309 04:12:27.106993 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:12:27 crc kubenswrapper[4901]: E0309 04:12:27.107930 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:12:39 crc 
kubenswrapper[4901]: I0309 04:12:39.106761 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:12:39 crc kubenswrapper[4901]: E0309 04:12:39.107862 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.731849 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 09 04:12:41 crc kubenswrapper[4901]: E0309 04:12:41.733262 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304b98fc-4d85-4041-ac04-aac01afd2005" containerName="extract-utilities" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.733297 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="304b98fc-4d85-4041-ac04-aac01afd2005" containerName="extract-utilities" Mar 09 04:12:41 crc kubenswrapper[4901]: E0309 04:12:41.733337 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304b98fc-4d85-4041-ac04-aac01afd2005" containerName="extract-content" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.733359 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="304b98fc-4d85-4041-ac04-aac01afd2005" containerName="extract-content" Mar 09 04:12:41 crc kubenswrapper[4901]: E0309 04:12:41.733394 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304b98fc-4d85-4041-ac04-aac01afd2005" containerName="registry-server" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.733410 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="304b98fc-4d85-4041-ac04-aac01afd2005" containerName="registry-server" 
Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.733786 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="304b98fc-4d85-4041-ac04-aac01afd2005" containerName="registry-server" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.735001 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.737929 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-58bdh" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.744390 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.776101 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cllch\" (UniqueName: \"kubernetes.io/projected/213b167a-15b0-4174-955f-f10bfbab4262-kube-api-access-cllch\") pod \"mariadb-copy-data\" (UID: \"213b167a-15b0-4174-955f-f10bfbab4262\") " pod="openstack/mariadb-copy-data" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.776207 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-45c524d2-7253-4771-9b28-bbbe4cb9ecee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c524d2-7253-4771-9b28-bbbe4cb9ecee\") pod \"mariadb-copy-data\" (UID: \"213b167a-15b0-4174-955f-f10bfbab4262\") " pod="openstack/mariadb-copy-data" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.878480 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cllch\" (UniqueName: \"kubernetes.io/projected/213b167a-15b0-4174-955f-f10bfbab4262-kube-api-access-cllch\") pod \"mariadb-copy-data\" (UID: \"213b167a-15b0-4174-955f-f10bfbab4262\") " pod="openstack/mariadb-copy-data" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.878562 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-45c524d2-7253-4771-9b28-bbbe4cb9ecee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c524d2-7253-4771-9b28-bbbe4cb9ecee\") pod \"mariadb-copy-data\" (UID: \"213b167a-15b0-4174-955f-f10bfbab4262\") " pod="openstack/mariadb-copy-data" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.881856 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.881903 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-45c524d2-7253-4771-9b28-bbbe4cb9ecee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c524d2-7253-4771-9b28-bbbe4cb9ecee\") pod \"mariadb-copy-data\" (UID: \"213b167a-15b0-4174-955f-f10bfbab4262\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c0332d732fd14d3fa2ae832c70933710a0380f64b6435f0feb8d23b5ddfbeeeb/globalmount\"" pod="openstack/mariadb-copy-data" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.916690 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cllch\" (UniqueName: \"kubernetes.io/projected/213b167a-15b0-4174-955f-f10bfbab4262-kube-api-access-cllch\") pod \"mariadb-copy-data\" (UID: \"213b167a-15b0-4174-955f-f10bfbab4262\") " pod="openstack/mariadb-copy-data" Mar 09 04:12:41 crc kubenswrapper[4901]: I0309 04:12:41.937414 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-45c524d2-7253-4771-9b28-bbbe4cb9ecee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c524d2-7253-4771-9b28-bbbe4cb9ecee\") pod \"mariadb-copy-data\" (UID: \"213b167a-15b0-4174-955f-f10bfbab4262\") " pod="openstack/mariadb-copy-data" Mar 09 04:12:42 crc kubenswrapper[4901]: I0309 04:12:42.107399 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 09 04:12:42 crc kubenswrapper[4901]: I0309 04:12:42.326202 4901 scope.go:117] "RemoveContainer" containerID="0a578304b4f8dd429dc0322a1a5701a989c64ed6aed0f8a27d5886dfe44f993d" Mar 09 04:12:42 crc kubenswrapper[4901]: I0309 04:12:42.741523 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 09 04:12:42 crc kubenswrapper[4901]: W0309 04:12:42.746613 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod213b167a_15b0_4174_955f_f10bfbab4262.slice/crio-a9f938e66cd5c56c28cd569dc71376d5bd8778e1d3b732c4b06596d5e2705644 WatchSource:0}: Error finding container a9f938e66cd5c56c28cd569dc71376d5bd8778e1d3b732c4b06596d5e2705644: Status 404 returned error can't find the container with id a9f938e66cd5c56c28cd569dc71376d5bd8778e1d3b732c4b06596d5e2705644 Mar 09 04:12:42 crc kubenswrapper[4901]: I0309 04:12:42.886526 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"213b167a-15b0-4174-955f-f10bfbab4262","Type":"ContainerStarted","Data":"a9f938e66cd5c56c28cd569dc71376d5bd8778e1d3b732c4b06596d5e2705644"} Mar 09 04:12:43 crc kubenswrapper[4901]: I0309 04:12:43.899642 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"213b167a-15b0-4174-955f-f10bfbab4262","Type":"ContainerStarted","Data":"92c84eecbccd2e7a97d13dda2a8fb7540a173ddf0ef6072e41c0e29e5bb21a97"} Mar 09 04:12:43 crc kubenswrapper[4901]: I0309 04:12:43.932313 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.9322919020000002 podStartE2EDuration="3.932291902s" podCreationTimestamp="2026-03-09 04:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:12:43.920785636 +0000 
UTC m=+5488.510449378" watchObservedRunningTime="2026-03-09 04:12:43.932291902 +0000 UTC m=+5488.521955644" Mar 09 04:12:46 crc kubenswrapper[4901]: I0309 04:12:46.824879 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 09 04:12:46 crc kubenswrapper[4901]: I0309 04:12:46.828022 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:12:46 crc kubenswrapper[4901]: I0309 04:12:46.837781 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:12:46 crc kubenswrapper[4901]: I0309 04:12:46.963994 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sbst\" (UniqueName: \"kubernetes.io/projected/de44b263-b764-42b8-bf47-ac99e05078dc-kube-api-access-9sbst\") pod \"mariadb-client\" (UID: \"de44b263-b764-42b8-bf47-ac99e05078dc\") " pod="openstack/mariadb-client" Mar 09 04:12:47 crc kubenswrapper[4901]: I0309 04:12:47.065322 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sbst\" (UniqueName: \"kubernetes.io/projected/de44b263-b764-42b8-bf47-ac99e05078dc-kube-api-access-9sbst\") pod \"mariadb-client\" (UID: \"de44b263-b764-42b8-bf47-ac99e05078dc\") " pod="openstack/mariadb-client" Mar 09 04:12:47 crc kubenswrapper[4901]: I0309 04:12:47.087468 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sbst\" (UniqueName: \"kubernetes.io/projected/de44b263-b764-42b8-bf47-ac99e05078dc-kube-api-access-9sbst\") pod \"mariadb-client\" (UID: \"de44b263-b764-42b8-bf47-ac99e05078dc\") " pod="openstack/mariadb-client" Mar 09 04:12:47 crc kubenswrapper[4901]: I0309 04:12:47.158564 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:12:47 crc kubenswrapper[4901]: I0309 04:12:47.450190 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:12:47 crc kubenswrapper[4901]: I0309 04:12:47.945394 4901 generic.go:334] "Generic (PLEG): container finished" podID="de44b263-b764-42b8-bf47-ac99e05078dc" containerID="b64c9fa4a78cb15087c846ae2b8aab8a8f0aad175208ab7db5a7c8eb04d05ea6" exitCode=0 Mar 09 04:12:47 crc kubenswrapper[4901]: I0309 04:12:47.945451 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"de44b263-b764-42b8-bf47-ac99e05078dc","Type":"ContainerDied","Data":"b64c9fa4a78cb15087c846ae2b8aab8a8f0aad175208ab7db5a7c8eb04d05ea6"} Mar 09 04:12:47 crc kubenswrapper[4901]: I0309 04:12:47.945685 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"de44b263-b764-42b8-bf47-ac99e05078dc","Type":"ContainerStarted","Data":"7d95b87e6219c2f204d33522150e0895c5e1775fc0e213fccd9d9e7bd7773dfe"} Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.284617 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.310816 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_de44b263-b764-42b8-bf47-ac99e05078dc/mariadb-client/0.log" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.338858 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.346380 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.408754 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sbst\" (UniqueName: \"kubernetes.io/projected/de44b263-b764-42b8-bf47-ac99e05078dc-kube-api-access-9sbst\") pod \"de44b263-b764-42b8-bf47-ac99e05078dc\" (UID: \"de44b263-b764-42b8-bf47-ac99e05078dc\") " Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.415239 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de44b263-b764-42b8-bf47-ac99e05078dc-kube-api-access-9sbst" (OuterVolumeSpecName: "kube-api-access-9sbst") pod "de44b263-b764-42b8-bf47-ac99e05078dc" (UID: "de44b263-b764-42b8-bf47-ac99e05078dc"). InnerVolumeSpecName "kube-api-access-9sbst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.503916 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 09 04:12:49 crc kubenswrapper[4901]: E0309 04:12:49.504418 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de44b263-b764-42b8-bf47-ac99e05078dc" containerName="mariadb-client" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.504444 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="de44b263-b764-42b8-bf47-ac99e05078dc" containerName="mariadb-client" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.504646 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="de44b263-b764-42b8-bf47-ac99e05078dc" containerName="mariadb-client" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.505282 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.510105 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sbst\" (UniqueName: \"kubernetes.io/projected/de44b263-b764-42b8-bf47-ac99e05078dc-kube-api-access-9sbst\") on node \"crc\" DevicePath \"\"" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.523386 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.611744 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q66c4\" (UniqueName: \"kubernetes.io/projected/99efe466-31c0-4c94-a7e9-d83b6e52af0e-kube-api-access-q66c4\") pod \"mariadb-client\" (UID: \"99efe466-31c0-4c94-a7e9-d83b6e52af0e\") " pod="openstack/mariadb-client" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.714487 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q66c4\" (UniqueName: 
\"kubernetes.io/projected/99efe466-31c0-4c94-a7e9-d83b6e52af0e-kube-api-access-q66c4\") pod \"mariadb-client\" (UID: \"99efe466-31c0-4c94-a7e9-d83b6e52af0e\") " pod="openstack/mariadb-client" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.747559 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q66c4\" (UniqueName: \"kubernetes.io/projected/99efe466-31c0-4c94-a7e9-d83b6e52af0e-kube-api-access-q66c4\") pod \"mariadb-client\" (UID: \"99efe466-31c0-4c94-a7e9-d83b6e52af0e\") " pod="openstack/mariadb-client" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.836925 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.970528 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d95b87e6219c2f204d33522150e0895c5e1775fc0e213fccd9d9e7bd7773dfe" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.970608 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:12:49 crc kubenswrapper[4901]: I0309 04:12:49.991774 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="de44b263-b764-42b8-bf47-ac99e05078dc" podUID="99efe466-31c0-4c94-a7e9-d83b6e52af0e" Mar 09 04:12:50 crc kubenswrapper[4901]: I0309 04:12:50.108602 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:12:50 crc kubenswrapper[4901]: E0309 04:12:50.108832 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:12:50 crc kubenswrapper[4901]: W0309 04:12:50.123364 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99efe466_31c0_4c94_a7e9_d83b6e52af0e.slice/crio-dbb440779f05bbfc9ee73115c6b4842b1813a4a295c685ee2ca1fc38c961bffa WatchSource:0}: Error finding container dbb440779f05bbfc9ee73115c6b4842b1813a4a295c685ee2ca1fc38c961bffa: Status 404 returned error can't find the container with id dbb440779f05bbfc9ee73115c6b4842b1813a4a295c685ee2ca1fc38c961bffa Mar 09 04:12:50 crc kubenswrapper[4901]: I0309 04:12:50.126322 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de44b263-b764-42b8-bf47-ac99e05078dc" path="/var/lib/kubelet/pods/de44b263-b764-42b8-bf47-ac99e05078dc/volumes" Mar 09 04:12:50 crc kubenswrapper[4901]: I0309 04:12:50.130381 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:12:51 crc kubenswrapper[4901]: I0309 
04:12:51.009625 4901 generic.go:334] "Generic (PLEG): container finished" podID="99efe466-31c0-4c94-a7e9-d83b6e52af0e" containerID="076be717e745166e865c397bca8d8d5d5e33035f0e3bd8455106f955f417df1a" exitCode=0 Mar 09 04:12:51 crc kubenswrapper[4901]: I0309 04:12:51.009750 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"99efe466-31c0-4c94-a7e9-d83b6e52af0e","Type":"ContainerDied","Data":"076be717e745166e865c397bca8d8d5d5e33035f0e3bd8455106f955f417df1a"} Mar 09 04:12:51 crc kubenswrapper[4901]: I0309 04:12:51.010109 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"99efe466-31c0-4c94-a7e9-d83b6e52af0e","Type":"ContainerStarted","Data":"dbb440779f05bbfc9ee73115c6b4842b1813a4a295c685ee2ca1fc38c961bffa"} Mar 09 04:12:52 crc kubenswrapper[4901]: I0309 04:12:52.361358 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:12:52 crc kubenswrapper[4901]: I0309 04:12:52.381039 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_99efe466-31c0-4c94-a7e9-d83b6e52af0e/mariadb-client/0.log" Mar 09 04:12:52 crc kubenswrapper[4901]: I0309 04:12:52.410454 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:12:52 crc kubenswrapper[4901]: I0309 04:12:52.420103 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 09 04:12:52 crc kubenswrapper[4901]: I0309 04:12:52.482029 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q66c4\" (UniqueName: \"kubernetes.io/projected/99efe466-31c0-4c94-a7e9-d83b6e52af0e-kube-api-access-q66c4\") pod \"99efe466-31c0-4c94-a7e9-d83b6e52af0e\" (UID: \"99efe466-31c0-4c94-a7e9-d83b6e52af0e\") " Mar 09 04:12:52 crc kubenswrapper[4901]: I0309 04:12:52.491661 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/99efe466-31c0-4c94-a7e9-d83b6e52af0e-kube-api-access-q66c4" (OuterVolumeSpecName: "kube-api-access-q66c4") pod "99efe466-31c0-4c94-a7e9-d83b6e52af0e" (UID: "99efe466-31c0-4c94-a7e9-d83b6e52af0e"). InnerVolumeSpecName "kube-api-access-q66c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:12:52 crc kubenswrapper[4901]: I0309 04:12:52.584364 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q66c4\" (UniqueName: \"kubernetes.io/projected/99efe466-31c0-4c94-a7e9-d83b6e52af0e-kube-api-access-q66c4\") on node \"crc\" DevicePath \"\"" Mar 09 04:12:53 crc kubenswrapper[4901]: I0309 04:12:53.032684 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbb440779f05bbfc9ee73115c6b4842b1813a4a295c685ee2ca1fc38c961bffa" Mar 09 04:12:53 crc kubenswrapper[4901]: I0309 04:12:53.032808 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 09 04:12:54 crc kubenswrapper[4901]: I0309 04:12:54.120361 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99efe466-31c0-4c94-a7e9-d83b6e52af0e" path="/var/lib/kubelet/pods/99efe466-31c0-4c94-a7e9-d83b6e52af0e/volumes" Mar 09 04:13:05 crc kubenswrapper[4901]: I0309 04:13:05.106140 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:13:05 crc kubenswrapper[4901]: E0309 04:13:05.107125 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:13:18 crc kubenswrapper[4901]: I0309 04:13:18.106687 4901 
scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:13:18 crc kubenswrapper[4901]: E0309 04:13:18.107382 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.407746 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 04:13:27 crc kubenswrapper[4901]: E0309 04:13:27.413517 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99efe466-31c0-4c94-a7e9-d83b6e52af0e" containerName="mariadb-client" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.413674 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="99efe466-31c0-4c94-a7e9-d83b6e52af0e" containerName="mariadb-client" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.419704 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="99efe466-31c0-4c94-a7e9-d83b6e52af0e" containerName="mariadb-client" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.446646 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.452276 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.452335 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.453132 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.453824 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.453977 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-vsfpx" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.472660 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.474567 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.492849 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.495414 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.505111 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.517077 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.526141 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.576876 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.576928 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.576950 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.576980 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d33b0e-cd98-42a1-8cff-58f6592d8818-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577001 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577040 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-config\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577065 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577078 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577159 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d33b0e-cd98-42a1-8cff-58f6592d8818-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " 
pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577202 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d33b0e-cd98-42a1-8cff-58f6592d8818-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577264 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6h6l\" (UniqueName: \"kubernetes.io/projected/13d33b0e-cd98-42a1-8cff-58f6592d8818-kube-api-access-c6h6l\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577283 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7f25436a-3cc1-4649-91a1-c9a36c1b01e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f25436a-3cc1-4649-91a1-c9a36c1b01e7\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577308 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-config\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577331 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " 
pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577354 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577384 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13d33b0e-cd98-42a1-8cff-58f6592d8818-config\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577400 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bdcfa546-29c5-4bbf-a954-f91feff645a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdcfa546-29c5-4bbf-a954-f91feff645a7\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577416 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-26681396-04a3-4be0-9ef3-83d3a4425bab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26681396-04a3-4be0-9ef3-83d3a4425bab\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577432 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5dhn\" (UniqueName: \"kubernetes.io/projected/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-kube-api-access-s5dhn\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " 
pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577488 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577509 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13d33b0e-cd98-42a1-8cff-58f6592d8818-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577539 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s62nv\" (UniqueName: \"kubernetes.io/projected/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-kube-api-access-s62nv\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577566 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13d33b0e-cd98-42a1-8cff-58f6592d8818-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.577588 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc 
kubenswrapper[4901]: I0309 04:13:27.679395 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d33b0e-cd98-42a1-8cff-58f6592d8818-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679440 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d33b0e-cd98-42a1-8cff-58f6592d8818-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679470 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6h6l\" (UniqueName: \"kubernetes.io/projected/13d33b0e-cd98-42a1-8cff-58f6592d8818-kube-api-access-c6h6l\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679490 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7f25436a-3cc1-4649-91a1-c9a36c1b01e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f25436a-3cc1-4649-91a1-c9a36c1b01e7\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679513 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-config\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679530 4901 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679551 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679572 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-26681396-04a3-4be0-9ef3-83d3a4425bab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26681396-04a3-4be0-9ef3-83d3a4425bab\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679588 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13d33b0e-cd98-42a1-8cff-58f6592d8818-config\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679603 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bdcfa546-29c5-4bbf-a954-f91feff645a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdcfa546-29c5-4bbf-a954-f91feff645a7\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679624 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5dhn\" (UniqueName: 
\"kubernetes.io/projected/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-kube-api-access-s5dhn\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679653 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679669 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13d33b0e-cd98-42a1-8cff-58f6592d8818-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679687 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s62nv\" (UniqueName: \"kubernetes.io/projected/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-kube-api-access-s62nv\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679706 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13d33b0e-cd98-42a1-8cff-58f6592d8818-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679722 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: 
\"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679743 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679760 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679774 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679797 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d33b0e-cd98-42a1-8cff-58f6592d8818-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679817 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679831 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-config\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679854 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.679868 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.681622 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.682447 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13d33b0e-cd98-42a1-8cff-58f6592d8818-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.682634 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: 
\"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.682882 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-config\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.683180 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.683331 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13d33b0e-cd98-42a1-8cff-58f6592d8818-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.683431 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13d33b0e-cd98-42a1-8cff-58f6592d8818-config\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.683569 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.684963 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-config\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.686176 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.687816 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.687949 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d33b0e-cd98-42a1-8cff-58f6592d8818-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.688013 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.688149 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.688174 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bdcfa546-29c5-4bbf-a954-f91feff645a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdcfa546-29c5-4bbf-a954-f91feff645a7\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/92a97a2c21afc503b8edc8a403bc963bdc0f85de92b9ae1d7065d9d22ee885cd/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.688283 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.688322 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-26681396-04a3-4be0-9ef3-83d3a4425bab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26681396-04a3-4be0-9ef3-83d3a4425bab\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8a1f2bb05d974ffde8fc26ca9cf98545f86d0fc825281190c1ac51fa6478dc95/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.688388 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.688410 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7f25436a-3cc1-4649-91a1-c9a36c1b01e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f25436a-3cc1-4649-91a1-c9a36c1b01e7\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4777b68248bdd6e6e558c2b746903cf429bc5730baeb56b45c87d57767e4c186/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.688895 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13d33b0e-cd98-42a1-8cff-58f6592d8818-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.689270 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.690929 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.691927 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: 
\"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.702557 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s62nv\" (UniqueName: \"kubernetes.io/projected/25f843bc-b581-4ff6-a7e3-fa153dbc5fff-kube-api-access-s62nv\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.702561 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5dhn\" (UniqueName: \"kubernetes.io/projected/3a4b857e-ca99-46cb-b4dc-ce16c46649ff-kube-api-access-s5dhn\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.704600 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6h6l\" (UniqueName: \"kubernetes.io/projected/13d33b0e-cd98-42a1-8cff-58f6592d8818-kube-api-access-c6h6l\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.709269 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d33b0e-cd98-42a1-8cff-58f6592d8818-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.724750 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-26681396-04a3-4be0-9ef3-83d3a4425bab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26681396-04a3-4be0-9ef3-83d3a4425bab\") pod \"ovsdbserver-nb-1\" (UID: \"3a4b857e-ca99-46cb-b4dc-ce16c46649ff\") " pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 
04:13:27.725723 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bdcfa546-29c5-4bbf-a954-f91feff645a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdcfa546-29c5-4bbf-a954-f91feff645a7\") pod \"ovsdbserver-nb-2\" (UID: \"25f843bc-b581-4ff6-a7e3-fa153dbc5fff\") " pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.726060 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7f25436a-3cc1-4649-91a1-c9a36c1b01e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f25436a-3cc1-4649-91a1-c9a36c1b01e7\") pod \"ovsdbserver-nb-0\" (UID: \"13d33b0e-cd98-42a1-8cff-58f6592d8818\") " pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.769705 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.790751 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:27 crc kubenswrapper[4901]: I0309 04:13:27.821511 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.666281 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.675560 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.687745 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-clvn7" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.688154 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.690362 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.694665 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.694730 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.704341 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.706577 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.712596 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.723043 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.724125 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.765686 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801323 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73820d01-78b9-4eb5-bbdd-1192bba6335b-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801374 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5b95\" (UniqueName: \"kubernetes.io/projected/423e84f6-36ec-4649-97e4-faf3da93684e-kube-api-access-f5b95\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801411 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73820d01-78b9-4eb5-bbdd-1192bba6335b-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801448 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pvc-cbba7ada-506f-49cb-8856-1c4f7b96e1d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cbba7ada-506f-49cb-8856-1c4f7b96e1d9\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801474 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73820d01-78b9-4eb5-bbdd-1192bba6335b-config\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801498 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73820d01-78b9-4eb5-bbdd-1192bba6335b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801524 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-01ee9cee-89ed-41e9-b860-ce77ee662936\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01ee9cee-89ed-41e9-b860-ce77ee662936\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801549 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a75a85d8-38a8-4799-8a0a-ca67151aa49a-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801569 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/423e84f6-36ec-4649-97e4-faf3da93684e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801588 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/423e84f6-36ec-4649-97e4-faf3da93684e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801608 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75a85d8-38a8-4799-8a0a-ca67151aa49a-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801637 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/73820d01-78b9-4eb5-bbdd-1192bba6335b-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801676 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423e84f6-36ec-4649-97e4-faf3da93684e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801696 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e4b0079a-1042-4e55-b2f8-91517341df8c\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e4b0079a-1042-4e55-b2f8-91517341df8c\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801723 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a75a85d8-38a8-4799-8a0a-ca67151aa49a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801747 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/423e84f6-36ec-4649-97e4-faf3da93684e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801767 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zfzl\" (UniqueName: \"kubernetes.io/projected/73820d01-78b9-4eb5-bbdd-1192bba6335b-kube-api-access-9zfzl\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801788 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a75a85d8-38a8-4799-8a0a-ca67151aa49a-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801820 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a75a85d8-38a8-4799-8a0a-ca67151aa49a-config\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801840 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwvvn\" (UniqueName: \"kubernetes.io/projected/a75a85d8-38a8-4799-8a0a-ca67151aa49a-kube-api-access-pwvvn\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801885 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/73820d01-78b9-4eb5-bbdd-1192bba6335b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801915 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/423e84f6-36ec-4649-97e4-faf3da93684e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801936 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423e84f6-36ec-4649-97e4-faf3da93684e-config\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.801962 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a75a85d8-38a8-4799-8a0a-ca67151aa49a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.876841 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.904176 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423e84f6-36ec-4649-97e4-faf3da93684e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.904282 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e4b0079a-1042-4e55-b2f8-91517341df8c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e4b0079a-1042-4e55-b2f8-91517341df8c\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.904394 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a75a85d8-38a8-4799-8a0a-ca67151aa49a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.906875 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.906911 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e4b0079a-1042-4e55-b2f8-91517341df8c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e4b0079a-1042-4e55-b2f8-91517341df8c\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/394bebed97ad5368ad1117dacaba8b2b244a2c91344041328bbf78048f6d6d21/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.906921 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/423e84f6-36ec-4649-97e4-faf3da93684e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.906958 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zfzl\" (UniqueName: \"kubernetes.io/projected/73820d01-78b9-4eb5-bbdd-1192bba6335b-kube-api-access-9zfzl\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.906992 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a75a85d8-38a8-4799-8a0a-ca67151aa49a-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.907040 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75a85d8-38a8-4799-8a0a-ca67151aa49a-config\") pod \"ovsdbserver-sb-2\" (UID: 
\"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.907128 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwvvn\" (UniqueName: \"kubernetes.io/projected/a75a85d8-38a8-4799-8a0a-ca67151aa49a-kube-api-access-pwvvn\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.907130 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423e84f6-36ec-4649-97e4-faf3da93684e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.907636 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/73820d01-78b9-4eb5-bbdd-1192bba6335b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.907704 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/423e84f6-36ec-4649-97e4-faf3da93684e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.907754 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423e84f6-36ec-4649-97e4-faf3da93684e-config\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.907795 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a75a85d8-38a8-4799-8a0a-ca67151aa49a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.907848 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73820d01-78b9-4eb5-bbdd-1192bba6335b-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.907877 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5b95\" (UniqueName: \"kubernetes.io/projected/423e84f6-36ec-4649-97e4-faf3da93684e-kube-api-access-f5b95\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.908205 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75a85d8-38a8-4799-8a0a-ca67151aa49a-config\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.908657 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423e84f6-36ec-4649-97e4-faf3da93684e-config\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.907925 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73820d01-78b9-4eb5-bbdd-1192bba6335b-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: 
\"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.909043 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cbba7ada-506f-49cb-8856-1c4f7b96e1d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cbba7ada-506f-49cb-8856-1c4f7b96e1d9\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.909068 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73820d01-78b9-4eb5-bbdd-1192bba6335b-config\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.909108 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73820d01-78b9-4eb5-bbdd-1192bba6335b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.909249 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/423e84f6-36ec-4649-97e4-faf3da93684e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.909212 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01ee9cee-89ed-41e9-b860-ce77ee662936\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01ee9cee-89ed-41e9-b860-ce77ee662936\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc 
kubenswrapper[4901]: I0309 04:13:28.909317 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a75a85d8-38a8-4799-8a0a-ca67151aa49a-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.909342 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/423e84f6-36ec-4649-97e4-faf3da93684e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.909424 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/423e84f6-36ec-4649-97e4-faf3da93684e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.909456 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75a85d8-38a8-4799-8a0a-ca67151aa49a-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.909502 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/73820d01-78b9-4eb5-bbdd-1192bba6335b-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.909530 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a75a85d8-38a8-4799-8a0a-ca67151aa49a-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.910336 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/73820d01-78b9-4eb5-bbdd-1192bba6335b-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.910780 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a75a85d8-38a8-4799-8a0a-ca67151aa49a-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.910797 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/423e84f6-36ec-4649-97e4-faf3da93684e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.911437 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73820d01-78b9-4eb5-bbdd-1192bba6335b-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.911652 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73820d01-78b9-4eb5-bbdd-1192bba6335b-config\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.912741 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a75a85d8-38a8-4799-8a0a-ca67151aa49a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.912960 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/73820d01-78b9-4eb5-bbdd-1192bba6335b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.914427 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a75a85d8-38a8-4799-8a0a-ca67151aa49a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.915848 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73820d01-78b9-4eb5-bbdd-1192bba6335b-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.920941 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/423e84f6-36ec-4649-97e4-faf3da93684e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.921551 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/423e84f6-36ec-4649-97e4-faf3da93684e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.922018 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.922046 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cbba7ada-506f-49cb-8856-1c4f7b96e1d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cbba7ada-506f-49cb-8856-1c4f7b96e1d9\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/63b2606a9e32a17bf04a271b64ac00651712852dd5d09be8421a48d9ce0df9df/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.922731 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.922760 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-01ee9cee-89ed-41e9-b860-ce77ee662936\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01ee9cee-89ed-41e9-b860-ce77ee662936\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/80c70aef378deb90aa43f18590df8e27e005f64569d48c7da6dee1f4a56d2c36/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.923774 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73820d01-78b9-4eb5-bbdd-1192bba6335b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.924845 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75a85d8-38a8-4799-8a0a-ca67151aa49a-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.925057 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5b95\" (UniqueName: \"kubernetes.io/projected/423e84f6-36ec-4649-97e4-faf3da93684e-kube-api-access-f5b95\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.926896 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zfzl\" (UniqueName: \"kubernetes.io/projected/73820d01-78b9-4eb5-bbdd-1192bba6335b-kube-api-access-9zfzl\") pod \"ovsdbserver-sb-1\" (UID: 
\"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.927299 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwvvn\" (UniqueName: \"kubernetes.io/projected/a75a85d8-38a8-4799-8a0a-ca67151aa49a-kube-api-access-pwvvn\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.961643 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01ee9cee-89ed-41e9-b860-ce77ee662936\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01ee9cee-89ed-41e9-b860-ce77ee662936\") pod \"ovsdbserver-sb-2\" (UID: \"a75a85d8-38a8-4799-8a0a-ca67151aa49a\") " pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.964263 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e4b0079a-1042-4e55-b2f8-91517341df8c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e4b0079a-1042-4e55-b2f8-91517341df8c\") pod \"ovsdbserver-sb-1\" (UID: \"73820d01-78b9-4eb5-bbdd-1192bba6335b\") " pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.991028 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cbba7ada-506f-49cb-8856-1c4f7b96e1d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cbba7ada-506f-49cb-8856-1c4f7b96e1d9\") pod \"ovsdbserver-sb-0\" (UID: \"423e84f6-36ec-4649-97e4-faf3da93684e\") " pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:28 crc kubenswrapper[4901]: I0309 04:13:28.997037 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.070873 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.081106 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.098601 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.362253 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"25f843bc-b581-4ff6-a7e3-fa153dbc5fff","Type":"ContainerStarted","Data":"41ba689f1f854956551fa6348e8be82e5a2b71026f89d8bad279389ed171c1c1"} Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.362291 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"25f843bc-b581-4ff6-a7e3-fa153dbc5fff","Type":"ContainerStarted","Data":"94814e01d86534afc671380496d44c1322a9f930bf043db567785bd26380e545"} Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.362300 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"25f843bc-b581-4ff6-a7e3-fa153dbc5fff","Type":"ContainerStarted","Data":"9bb61888c6db7df3827cecfea763bc3ddb0b54c072d5431770d5c4a14231b80a"} Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.365117 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3a4b857e-ca99-46cb-b4dc-ce16c46649ff","Type":"ContainerStarted","Data":"335469b938c1236d8f9cbf06b5751ac4e00d72e4edd2c21303ddb184dd02b0a7"} Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.365138 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3a4b857e-ca99-46cb-b4dc-ce16c46649ff","Type":"ContainerStarted","Data":"fd6472e1dd96456ee6b9f503c22df58e9cbd6675a61be7a31c3c0a8ac1e72dd6"} Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.365148 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3a4b857e-ca99-46cb-b4dc-ce16c46649ff","Type":"ContainerStarted","Data":"b6f34cd3f713a24d3fe41646c2d12d29b3be8d9f14ad6f615c2dbdc084e37312"} Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.390892 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.39087674 podStartE2EDuration="3.39087674s" podCreationTimestamp="2026-03-09 04:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:13:29.386992854 +0000 UTC m=+5533.976656616" watchObservedRunningTime="2026-03-09 04:13:29.39087674 +0000 UTC m=+5533.980540462" Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.408962 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.408942799 podStartE2EDuration="3.408942799s" podCreationTimestamp="2026-03-09 04:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:13:29.407232137 +0000 UTC m=+5533.996895869" watchObservedRunningTime="2026-03-09 04:13:29.408942799 +0000 UTC m=+5533.998606541" Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.606234 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 04:13:29 crc kubenswrapper[4901]: W0309 04:13:29.615125 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod423e84f6_36ec_4649_97e4_faf3da93684e.slice/crio-8b764483729d1bff8744999cc09d70dc53614c8909ca5f9f84c5c030e271f23f WatchSource:0}: Error finding container 8b764483729d1bff8744999cc09d70dc53614c8909ca5f9f84c5c030e271f23f: Status 404 returned error can't find the container with id 
8b764483729d1bff8744999cc09d70dc53614c8909ca5f9f84c5c030e271f23f Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.714011 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 09 04:13:29 crc kubenswrapper[4901]: W0309 04:13:29.732265 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73820d01_78b9_4eb5_bbdd_1192bba6335b.slice/crio-d6ee676775a68f11cfaa0c4d0b408de9e3683ea85521f94d2c223dfcd8911137 WatchSource:0}: Error finding container d6ee676775a68f11cfaa0c4d0b408de9e3683ea85521f94d2c223dfcd8911137: Status 404 returned error can't find the container with id d6ee676775a68f11cfaa0c4d0b408de9e3683ea85521f94d2c223dfcd8911137 Mar 09 04:13:29 crc kubenswrapper[4901]: I0309 04:13:29.820916 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 04:13:29 crc kubenswrapper[4901]: W0309 04:13:29.833303 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13d33b0e_cd98_42a1_8cff_58f6592d8818.slice/crio-a54d3e8875215604ef1be605d3dcaf641ea6e03dc75da00c7bb6f9959092fee0 WatchSource:0}: Error finding container a54d3e8875215604ef1be605d3dcaf641ea6e03dc75da00c7bb6f9959092fee0: Status 404 returned error can't find the container with id a54d3e8875215604ef1be605d3dcaf641ea6e03dc75da00c7bb6f9959092fee0 Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.375168 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13d33b0e-cd98-42a1-8cff-58f6592d8818","Type":"ContainerStarted","Data":"08040cf884325c64118c37d25888682ee8d1143088119f143234ba66a21b878f"} Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.375451 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"13d33b0e-cd98-42a1-8cff-58f6592d8818","Type":"ContainerStarted","Data":"cc0480fe87eec4d79c048baa8c0456ad8f53fb749cc6f670b4d11a582df7cad5"} Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.375462 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13d33b0e-cd98-42a1-8cff-58f6592d8818","Type":"ContainerStarted","Data":"a54d3e8875215604ef1be605d3dcaf641ea6e03dc75da00c7bb6f9959092fee0"} Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.377681 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"73820d01-78b9-4eb5-bbdd-1192bba6335b","Type":"ContainerStarted","Data":"64eecee5194db7172c6c1e99f0a3ce69805450cea391dba5b65fe63e3d9448bb"} Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.377730 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"73820d01-78b9-4eb5-bbdd-1192bba6335b","Type":"ContainerStarted","Data":"f6a5129438e0a4e297a745caae2abe08c0c152edd235be3e622605871cadb56b"} Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.377740 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"73820d01-78b9-4eb5-bbdd-1192bba6335b","Type":"ContainerStarted","Data":"d6ee676775a68f11cfaa0c4d0b408de9e3683ea85521f94d2c223dfcd8911137"} Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.380204 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"423e84f6-36ec-4649-97e4-faf3da93684e","Type":"ContainerStarted","Data":"935b72493c72e15a658c9bc64d23fb294709dcccf7420c972df1001ef8c0d108"} Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.380306 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"423e84f6-36ec-4649-97e4-faf3da93684e","Type":"ContainerStarted","Data":"eea4306ef98a7fef36e717efdc3ae373798339b3151d43fb891a974ba7ac4ae1"} Mar 09 04:13:30 crc 
kubenswrapper[4901]: I0309 04:13:30.380320 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"423e84f6-36ec-4649-97e4-faf3da93684e","Type":"ContainerStarted","Data":"8b764483729d1bff8744999cc09d70dc53614c8909ca5f9f84c5c030e271f23f"} Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.404909 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.404867496 podStartE2EDuration="4.404867496s" podCreationTimestamp="2026-03-09 04:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:13:30.399100173 +0000 UTC m=+5534.988763945" watchObservedRunningTime="2026-03-09 04:13:30.404867496 +0000 UTC m=+5534.994531238" Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.435910 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.435882096 podStartE2EDuration="3.435882096s" podCreationTimestamp="2026-03-09 04:13:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:13:30.428095283 +0000 UTC m=+5535.017759055" watchObservedRunningTime="2026-03-09 04:13:30.435882096 +0000 UTC m=+5535.025545868" Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.467288 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.467263587 podStartE2EDuration="3.467263587s" podCreationTimestamp="2026-03-09 04:13:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:13:30.462614391 +0000 UTC m=+5535.052278123" watchObservedRunningTime="2026-03-09 04:13:30.467263587 +0000 UTC m=+5535.056927349" Mar 09 04:13:30 crc kubenswrapper[4901]: 
I0309 04:13:30.508029 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 09 04:13:30 crc kubenswrapper[4901]: W0309 04:13:30.513712 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75a85d8_38a8_4799_8a0a_ca67151aa49a.slice/crio-baea646f186ceaa5baf1e6332a896b1bb86c4b21b740e3007a0e88b88de43204 WatchSource:0}: Error finding container baea646f186ceaa5baf1e6332a896b1bb86c4b21b740e3007a0e88b88de43204: Status 404 returned error can't find the container with id baea646f186ceaa5baf1e6332a896b1bb86c4b21b740e3007a0e88b88de43204 Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.770678 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.791168 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:30 crc kubenswrapper[4901]: I0309 04:13:30.822428 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:31 crc kubenswrapper[4901]: I0309 04:13:31.107390 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:13:31 crc kubenswrapper[4901]: E0309 04:13:31.107807 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:13:31 crc kubenswrapper[4901]: I0309 04:13:31.395632 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"a75a85d8-38a8-4799-8a0a-ca67151aa49a","Type":"ContainerStarted","Data":"6c4c0201ef8e2e6db35383432463edce84715880ff4c47d93304d9a2aa69eff7"} Mar 09 04:13:31 crc kubenswrapper[4901]: I0309 04:13:31.395722 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"a75a85d8-38a8-4799-8a0a-ca67151aa49a","Type":"ContainerStarted","Data":"5a9a1eef9709bb4f2c199b98cc16e66b14cf5c546a257990d787d0a9120cf31f"} Mar 09 04:13:31 crc kubenswrapper[4901]: I0309 04:13:31.395750 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"a75a85d8-38a8-4799-8a0a-ca67151aa49a","Type":"ContainerStarted","Data":"baea646f186ceaa5baf1e6332a896b1bb86c4b21b740e3007a0e88b88de43204"} Mar 09 04:13:31 crc kubenswrapper[4901]: I0309 04:13:31.429513 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.429489615 podStartE2EDuration="4.429489615s" podCreationTimestamp="2026-03-09 04:13:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:13:31.419177129 +0000 UTC m=+5536.008840901" watchObservedRunningTime="2026-03-09 04:13:31.429489615 +0000 UTC m=+5536.019153387" Mar 09 04:13:32 crc kubenswrapper[4901]: I0309 04:13:32.071859 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:32 crc kubenswrapper[4901]: I0309 04:13:32.082801 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:32 crc kubenswrapper[4901]: I0309 04:13:32.099921 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:32 crc kubenswrapper[4901]: I0309 04:13:32.144070 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:32 crc kubenswrapper[4901]: I0309 04:13:32.179385 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:32 crc kubenswrapper[4901]: I0309 04:13:32.408049 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:32 crc kubenswrapper[4901]: I0309 04:13:32.409250 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:32 crc kubenswrapper[4901]: I0309 04:13:32.770760 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:32 crc kubenswrapper[4901]: I0309 04:13:32.791199 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:32 crc kubenswrapper[4901]: I0309 04:13:32.822687 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:33 crc kubenswrapper[4901]: I0309 04:13:33.826912 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:33 crc kubenswrapper[4901]: I0309 04:13:33.878541 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:33 crc kubenswrapper[4901]: I0309 04:13:33.906930 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.082509 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.144831 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.145909 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ovsdbserver-sb-1" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.475192 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56c7d464d5-znwrj"] Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.476493 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.478398 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.490183 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.498482 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c7d464d5-znwrj"] Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.525079 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-config\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.525392 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-dns-svc\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.525438 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-ovsdbserver-sb\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: 
\"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.525639 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8bbc\" (UniqueName: \"kubernetes.io/projected/2525a85d-d6c1-48c5-a646-6c4c634186f7-kube-api-access-j8bbc\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.627612 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8bbc\" (UniqueName: \"kubernetes.io/projected/2525a85d-d6c1-48c5-a646-6c4c634186f7-kube-api-access-j8bbc\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.627878 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-config\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.627986 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-dns-svc\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.628025 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-ovsdbserver-sb\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: 
\"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.631496 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-dns-svc\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.631553 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-ovsdbserver-sb\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.631576 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-config\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.650805 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8bbc\" (UniqueName: \"kubernetes.io/projected/2525a85d-d6c1-48c5-a646-6c4c634186f7-kube-api-access-j8bbc\") pod \"dnsmasq-dns-56c7d464d5-znwrj\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.800549 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.882435 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c7d464d5-znwrj"] Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.925984 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-857547b655-qxjlx"] Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.927169 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.930612 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 09 04:13:34 crc kubenswrapper[4901]: I0309 04:13:34.946438 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-857547b655-qxjlx"] Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.036104 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-dns-svc\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.036156 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-sb\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.036194 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-nb\") pod 
\"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.036236 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtzdg\" (UniqueName: \"kubernetes.io/projected/cfe2f592-c7ea-4838-9cc2-281cbf406cae-kube-api-access-xtzdg\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.036310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-config\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.125833 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.137345 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-dns-svc\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.137397 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-sb\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.137426 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-nb\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.137450 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtzdg\" (UniqueName: \"kubernetes.io/projected/cfe2f592-c7ea-4838-9cc2-281cbf406cae-kube-api-access-xtzdg\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.137492 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-config\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.139482 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-sb\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.140022 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-dns-svc\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.143417 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-nb\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.146573 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-config\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.172475 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtzdg\" (UniqueName: \"kubernetes.io/projected/cfe2f592-c7ea-4838-9cc2-281cbf406cae-kube-api-access-xtzdg\") pod \"dnsmasq-dns-857547b655-qxjlx\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.249702 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.346426 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c7d464d5-znwrj"] Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.457769 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" event={"ID":"2525a85d-d6c1-48c5-a646-6c4c634186f7","Type":"ContainerStarted","Data":"facfa10a23fea3b74dbe711ef610e44759c38ed3f437566ebde8ce5d977d1ae1"} Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.498470 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 09 04:13:35 crc kubenswrapper[4901]: I0309 04:13:35.674031 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-857547b655-qxjlx"] Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.468210 4901 generic.go:334] "Generic (PLEG): container finished" podID="2525a85d-d6c1-48c5-a646-6c4c634186f7" containerID="52cb468e87c2582b6465ac4e011dfa09c2f39351b9f5a0b33f881793b8eea266" exitCode=0 Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.468365 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" event={"ID":"2525a85d-d6c1-48c5-a646-6c4c634186f7","Type":"ContainerDied","Data":"52cb468e87c2582b6465ac4e011dfa09c2f39351b9f5a0b33f881793b8eea266"} Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.470383 4901 generic.go:334] "Generic (PLEG): container finished" podID="cfe2f592-c7ea-4838-9cc2-281cbf406cae" containerID="f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe" exitCode=0 Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.470439 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857547b655-qxjlx" 
event={"ID":"cfe2f592-c7ea-4838-9cc2-281cbf406cae","Type":"ContainerDied","Data":"f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe"} Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.470475 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857547b655-qxjlx" event={"ID":"cfe2f592-c7ea-4838-9cc2-281cbf406cae","Type":"ContainerStarted","Data":"ff602be06a69b748c455e7cb437cf1a307874983fdcd93f7f2c9fba5913141eb"} Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.886878 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.982469 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-dns-svc\") pod \"2525a85d-d6c1-48c5-a646-6c4c634186f7\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.982551 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-ovsdbserver-sb\") pod \"2525a85d-d6c1-48c5-a646-6c4c634186f7\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.982673 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-config\") pod \"2525a85d-d6c1-48c5-a646-6c4c634186f7\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.982715 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8bbc\" (UniqueName: \"kubernetes.io/projected/2525a85d-d6c1-48c5-a646-6c4c634186f7-kube-api-access-j8bbc\") pod 
\"2525a85d-d6c1-48c5-a646-6c4c634186f7\" (UID: \"2525a85d-d6c1-48c5-a646-6c4c634186f7\") " Mar 09 04:13:36 crc kubenswrapper[4901]: I0309 04:13:36.988288 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2525a85d-d6c1-48c5-a646-6c4c634186f7-kube-api-access-j8bbc" (OuterVolumeSpecName: "kube-api-access-j8bbc") pod "2525a85d-d6c1-48c5-a646-6c4c634186f7" (UID: "2525a85d-d6c1-48c5-a646-6c4c634186f7"). InnerVolumeSpecName "kube-api-access-j8bbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.004827 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2525a85d-d6c1-48c5-a646-6c4c634186f7" (UID: "2525a85d-d6c1-48c5-a646-6c4c634186f7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.008143 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2525a85d-d6c1-48c5-a646-6c4c634186f7" (UID: "2525a85d-d6c1-48c5-a646-6c4c634186f7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.014520 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-config" (OuterVolumeSpecName: "config") pod "2525a85d-d6c1-48c5-a646-6c4c634186f7" (UID: "2525a85d-d6c1-48c5-a646-6c4c634186f7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.084110 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8bbc\" (UniqueName: \"kubernetes.io/projected/2525a85d-d6c1-48c5-a646-6c4c634186f7-kube-api-access-j8bbc\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.084148 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.084162 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.084171 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2525a85d-d6c1-48c5-a646-6c4c634186f7-config\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.486256 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857547b655-qxjlx" event={"ID":"cfe2f592-c7ea-4838-9cc2-281cbf406cae","Type":"ContainerStarted","Data":"29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3"} Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.487352 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.489809 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" event={"ID":"2525a85d-d6c1-48c5-a646-6c4c634186f7","Type":"ContainerDied","Data":"facfa10a23fea3b74dbe711ef610e44759c38ed3f437566ebde8ce5d977d1ae1"} Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.489874 4901 scope.go:117] 
"RemoveContainer" containerID="52cb468e87c2582b6465ac4e011dfa09c2f39351b9f5a0b33f881793b8eea266" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.489877 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c7d464d5-znwrj" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.538303 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-857547b655-qxjlx" podStartSLOduration=3.538259887 podStartE2EDuration="3.538259887s" podCreationTimestamp="2026-03-09 04:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:13:37.520344061 +0000 UTC m=+5542.110007813" watchObservedRunningTime="2026-03-09 04:13:37.538259887 +0000 UTC m=+5542.127923639" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.579126 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c7d464d5-znwrj"] Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.585667 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56c7d464d5-znwrj"] Mar 09 04:13:37 crc kubenswrapper[4901]: E0309 04:13:37.741890 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2525a85d_d6c1_48c5_a646_6c4c634186f7.slice/crio-facfa10a23fea3b74dbe711ef610e44759c38ed3f437566ebde8ce5d977d1ae1\": RecentStats: unable to find data in memory cache]" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.831270 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 09 04:13:37 crc kubenswrapper[4901]: I0309 04:13:37.867862 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 09 04:13:38 crc kubenswrapper[4901]: I0309 04:13:38.117298 
4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2525a85d-d6c1-48c5-a646-6c4c634186f7" path="/var/lib/kubelet/pods/2525a85d-d6c1-48c5-a646-6c4c634186f7/volumes" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.792036 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 09 04:13:40 crc kubenswrapper[4901]: E0309 04:13:40.793098 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2525a85d-d6c1-48c5-a646-6c4c634186f7" containerName="init" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.793129 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2525a85d-d6c1-48c5-a646-6c4c634186f7" containerName="init" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.793575 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2525a85d-d6c1-48c5-a646-6c4c634186f7" containerName="init" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.794803 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.799326 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.815868 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.856872 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcdvk\" (UniqueName: \"kubernetes.io/projected/0955f311-3f27-47c0-85e5-e9a5f8136516-kube-api-access-xcdvk\") pod \"ovn-copy-data\" (UID: \"0955f311-3f27-47c0-85e5-e9a5f8136516\") " pod="openstack/ovn-copy-data" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.857036 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a7ed60b5-fe46-48d6-8e28-943635f51019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7ed60b5-fe46-48d6-8e28-943635f51019\") pod \"ovn-copy-data\" (UID: \"0955f311-3f27-47c0-85e5-e9a5f8136516\") " pod="openstack/ovn-copy-data" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.857068 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/0955f311-3f27-47c0-85e5-e9a5f8136516-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"0955f311-3f27-47c0-85e5-e9a5f8136516\") " pod="openstack/ovn-copy-data" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.959510 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcdvk\" (UniqueName: \"kubernetes.io/projected/0955f311-3f27-47c0-85e5-e9a5f8136516-kube-api-access-xcdvk\") pod \"ovn-copy-data\" (UID: \"0955f311-3f27-47c0-85e5-e9a5f8136516\") " pod="openstack/ovn-copy-data" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.959671 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a7ed60b5-fe46-48d6-8e28-943635f51019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7ed60b5-fe46-48d6-8e28-943635f51019\") pod \"ovn-copy-data\" (UID: \"0955f311-3f27-47c0-85e5-e9a5f8136516\") " pod="openstack/ovn-copy-data" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.959719 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/0955f311-3f27-47c0-85e5-e9a5f8136516-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"0955f311-3f27-47c0-85e5-e9a5f8136516\") " pod="openstack/ovn-copy-data" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.976476 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/0955f311-3f27-47c0-85e5-e9a5f8136516-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"0955f311-3f27-47c0-85e5-e9a5f8136516\") " pod="openstack/ovn-copy-data" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.993202 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcdvk\" (UniqueName: \"kubernetes.io/projected/0955f311-3f27-47c0-85e5-e9a5f8136516-kube-api-access-xcdvk\") pod \"ovn-copy-data\" (UID: \"0955f311-3f27-47c0-85e5-e9a5f8136516\") " pod="openstack/ovn-copy-data" Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.995334 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 04:13:40 crc kubenswrapper[4901]: I0309 04:13:40.995368 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a7ed60b5-fe46-48d6-8e28-943635f51019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7ed60b5-fe46-48d6-8e28-943635f51019\") pod \"ovn-copy-data\" (UID: \"0955f311-3f27-47c0-85e5-e9a5f8136516\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3b8c039c1e03078c6fd85a3bf151d120f2e77cbbbf64000e4ad48a63401f2d4d/globalmount\"" pod="openstack/ovn-copy-data" Mar 09 04:13:41 crc kubenswrapper[4901]: I0309 04:13:41.105201 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a7ed60b5-fe46-48d6-8e28-943635f51019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7ed60b5-fe46-48d6-8e28-943635f51019\") pod \"ovn-copy-data\" (UID: \"0955f311-3f27-47c0-85e5-e9a5f8136516\") " pod="openstack/ovn-copy-data" Mar 09 04:13:41 crc kubenswrapper[4901]: I0309 04:13:41.126328 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 09 04:13:41 crc kubenswrapper[4901]: I0309 04:13:41.707164 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 09 04:13:41 crc kubenswrapper[4901]: W0309 04:13:41.708626 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0955f311_3f27_47c0_85e5_e9a5f8136516.slice/crio-3bfc8ee903dde8b5b3fe6066e54e22b62e8a9f4476ac83c91a3a073186c5598d WatchSource:0}: Error finding container 3bfc8ee903dde8b5b3fe6066e54e22b62e8a9f4476ac83c91a3a073186c5598d: Status 404 returned error can't find the container with id 3bfc8ee903dde8b5b3fe6066e54e22b62e8a9f4476ac83c91a3a073186c5598d Mar 09 04:13:42 crc kubenswrapper[4901]: I0309 04:13:42.106680 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:13:42 crc kubenswrapper[4901]: E0309 04:13:42.108714 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:13:42 crc kubenswrapper[4901]: I0309 04:13:42.422612 4901 scope.go:117] "RemoveContainer" containerID="5beda19574c2f6440ef021be2906dcd6db84cab953e4edf99f91a06eef67d595" Mar 09 04:13:42 crc kubenswrapper[4901]: I0309 04:13:42.445000 4901 scope.go:117] "RemoveContainer" containerID="c8c9ad9a119d0e3744ac90f19b2e3b0fcd6cb5fd7c18bc92d27dafc5f8babe31" Mar 09 04:13:42 crc kubenswrapper[4901]: I0309 04:13:42.501604 4901 scope.go:117] "RemoveContainer" containerID="442a4abd63bbab64729a563c9165f3aeec4099412eea7769e69110fa68ab6817" Mar 09 04:13:42 crc kubenswrapper[4901]: 
I0309 04:13:42.548293 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"0955f311-3f27-47c0-85e5-e9a5f8136516","Type":"ContainerStarted","Data":"17b48f74acfd03e402eaa479713a1d4e69d17f88048115ee2d5052fdd94fc12a"} Mar 09 04:13:42 crc kubenswrapper[4901]: I0309 04:13:42.548358 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"0955f311-3f27-47c0-85e5-e9a5f8136516","Type":"ContainerStarted","Data":"3bfc8ee903dde8b5b3fe6066e54e22b62e8a9f4476ac83c91a3a073186c5598d"} Mar 09 04:13:42 crc kubenswrapper[4901]: I0309 04:13:42.573404 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.077025024 podStartE2EDuration="3.573387458s" podCreationTimestamp="2026-03-09 04:13:39 +0000 UTC" firstStartedPulling="2026-03-09 04:13:41.711674489 +0000 UTC m=+5546.301338231" lastFinishedPulling="2026-03-09 04:13:42.208036923 +0000 UTC m=+5546.797700665" observedRunningTime="2026-03-09 04:13:42.569185054 +0000 UTC m=+5547.158848876" watchObservedRunningTime="2026-03-09 04:13:42.573387458 +0000 UTC m=+5547.163051200" Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.252559 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.360818 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-w8wd7"] Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.361098 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" podUID="40189f68-54f4-4d38-ab8a-5a470afd7461" containerName="dnsmasq-dns" containerID="cri-o://87123861efab25958328587e10c056c20fcb622dc1423128ed57f036eab73417" gracePeriod=10 Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.386415 4901 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" podUID="40189f68-54f4-4d38-ab8a-5a470afd7461" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.29:5353: connect: connection refused" Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.597095 4901 generic.go:334] "Generic (PLEG): container finished" podID="40189f68-54f4-4d38-ab8a-5a470afd7461" containerID="87123861efab25958328587e10c056c20fcb622dc1423128ed57f036eab73417" exitCode=0 Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.597288 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" event={"ID":"40189f68-54f4-4d38-ab8a-5a470afd7461","Type":"ContainerDied","Data":"87123861efab25958328587e10c056c20fcb622dc1423128ed57f036eab73417"} Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.836208 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.855862 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7562\" (UniqueName: \"kubernetes.io/projected/40189f68-54f4-4d38-ab8a-5a470afd7461-kube-api-access-s7562\") pod \"40189f68-54f4-4d38-ab8a-5a470afd7461\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.855907 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-config\") pod \"40189f68-54f4-4d38-ab8a-5a470afd7461\" (UID: \"40189f68-54f4-4d38-ab8a-5a470afd7461\") " Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.855948 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-dns-svc\") pod \"40189f68-54f4-4d38-ab8a-5a470afd7461\" (UID: 
\"40189f68-54f4-4d38-ab8a-5a470afd7461\") " Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.870460 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40189f68-54f4-4d38-ab8a-5a470afd7461-kube-api-access-s7562" (OuterVolumeSpecName: "kube-api-access-s7562") pod "40189f68-54f4-4d38-ab8a-5a470afd7461" (UID: "40189f68-54f4-4d38-ab8a-5a470afd7461"). InnerVolumeSpecName "kube-api-access-s7562". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.902288 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-config" (OuterVolumeSpecName: "config") pod "40189f68-54f4-4d38-ab8a-5a470afd7461" (UID: "40189f68-54f4-4d38-ab8a-5a470afd7461"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.912953 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "40189f68-54f4-4d38-ab8a-5a470afd7461" (UID: "40189f68-54f4-4d38-ab8a-5a470afd7461"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.957401 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.957431 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7562\" (UniqueName: \"kubernetes.io/projected/40189f68-54f4-4d38-ab8a-5a470afd7461-kube-api-access-s7562\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:45 crc kubenswrapper[4901]: I0309 04:13:45.957441 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40189f68-54f4-4d38-ab8a-5a470afd7461-config\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:46 crc kubenswrapper[4901]: I0309 04:13:46.614365 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" event={"ID":"40189f68-54f4-4d38-ab8a-5a470afd7461","Type":"ContainerDied","Data":"8a711aff529047836d40f77bd67f5d8f4099a0c4bde98d0143da8916356e0d12"} Mar 09 04:13:46 crc kubenswrapper[4901]: I0309 04:13:46.614429 4901 scope.go:117] "RemoveContainer" containerID="87123861efab25958328587e10c056c20fcb622dc1423128ed57f036eab73417" Mar 09 04:13:46 crc kubenswrapper[4901]: I0309 04:13:46.614430 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-w8wd7" Mar 09 04:13:46 crc kubenswrapper[4901]: I0309 04:13:46.650513 4901 scope.go:117] "RemoveContainer" containerID="8d1447f7850ff411a2fa3e4fb4674d47183a1fdf3f0b10f8efb1817d085aa4a4" Mar 09 04:13:46 crc kubenswrapper[4901]: I0309 04:13:46.653419 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-w8wd7"] Mar 09 04:13:46 crc kubenswrapper[4901]: I0309 04:13:46.667114 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-w8wd7"] Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.885885 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 09 04:13:47 crc kubenswrapper[4901]: E0309 04:13:47.886588 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40189f68-54f4-4d38-ab8a-5a470afd7461" containerName="init" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.886611 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40189f68-54f4-4d38-ab8a-5a470afd7461" containerName="init" Mar 09 04:13:47 crc kubenswrapper[4901]: E0309 04:13:47.886664 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40189f68-54f4-4d38-ab8a-5a470afd7461" containerName="dnsmasq-dns" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.886674 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="40189f68-54f4-4d38-ab8a-5a470afd7461" containerName="dnsmasq-dns" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.886864 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="40189f68-54f4-4d38-ab8a-5a470afd7461" containerName="dnsmasq-dns" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.889322 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.889424 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.892159 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-k8jc4" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.892393 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.892522 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.892689 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.893280 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c5890487-b519-475d-9856-6449948d8f14-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.893323 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5890487-b519-475d-9856-6449948d8f14-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.893365 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5890487-b519-475d-9856-6449948d8f14-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.893430 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5890487-b519-475d-9856-6449948d8f14-config\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.893496 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snf7b\" (UniqueName: \"kubernetes.io/projected/c5890487-b519-475d-9856-6449948d8f14-kube-api-access-snf7b\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.893525 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5890487-b519-475d-9856-6449948d8f14-scripts\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.893553 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5890487-b519-475d-9856-6449948d8f14-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.995403 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5890487-b519-475d-9856-6449948d8f14-config\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.995489 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snf7b\" (UniqueName: 
\"kubernetes.io/projected/c5890487-b519-475d-9856-6449948d8f14-kube-api-access-snf7b\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.995514 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5890487-b519-475d-9856-6449948d8f14-scripts\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.995542 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5890487-b519-475d-9856-6449948d8f14-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.995576 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c5890487-b519-475d-9856-6449948d8f14-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.995595 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5890487-b519-475d-9856-6449948d8f14-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.995628 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5890487-b519-475d-9856-6449948d8f14-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc 
kubenswrapper[4901]: I0309 04:13:47.996598 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5890487-b519-475d-9856-6449948d8f14-config\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.996914 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c5890487-b519-475d-9856-6449948d8f14-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:47 crc kubenswrapper[4901]: I0309 04:13:47.997437 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5890487-b519-475d-9856-6449948d8f14-scripts\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:48 crc kubenswrapper[4901]: I0309 04:13:48.001758 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5890487-b519-475d-9856-6449948d8f14-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:48 crc kubenswrapper[4901]: I0309 04:13:48.003999 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5890487-b519-475d-9856-6449948d8f14-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:48 crc kubenswrapper[4901]: I0309 04:13:48.004291 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5890487-b519-475d-9856-6449948d8f14-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:48 crc kubenswrapper[4901]: I0309 04:13:48.017047 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snf7b\" (UniqueName: \"kubernetes.io/projected/c5890487-b519-475d-9856-6449948d8f14-kube-api-access-snf7b\") pod \"ovn-northd-0\" (UID: \"c5890487-b519-475d-9856-6449948d8f14\") " pod="openstack/ovn-northd-0" Mar 09 04:13:48 crc kubenswrapper[4901]: I0309 04:13:48.118436 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40189f68-54f4-4d38-ab8a-5a470afd7461" path="/var/lib/kubelet/pods/40189f68-54f4-4d38-ab8a-5a470afd7461/volumes" Mar 09 04:13:48 crc kubenswrapper[4901]: I0309 04:13:48.211331 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 04:13:48 crc kubenswrapper[4901]: I0309 04:13:48.664323 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 04:13:49 crc kubenswrapper[4901]: I0309 04:13:49.646924 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c5890487-b519-475d-9856-6449948d8f14","Type":"ContainerStarted","Data":"10c5d0bea11a372d283861a002202ec4c62f9b65f5b6eb2da9f2626d913cde7b"} Mar 09 04:13:49 crc kubenswrapper[4901]: I0309 04:13:49.647435 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c5890487-b519-475d-9856-6449948d8f14","Type":"ContainerStarted","Data":"a98f404f29b4c6993151161f0e9ef0b2e4f8771b3a6e97a84150ab03b567d466"} Mar 09 04:13:49 crc kubenswrapper[4901]: I0309 04:13:49.647450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c5890487-b519-475d-9856-6449948d8f14","Type":"ContainerStarted","Data":"85e449d45b6c46f9f93fae31534c074ef33376aa1d7ff55cfd43be857474f29a"} Mar 09 04:13:49 crc kubenswrapper[4901]: I0309 04:13:49.647575 4901 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 09 04:13:49 crc kubenswrapper[4901]: I0309 04:13:49.692460 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.692422591 podStartE2EDuration="2.692422591s" podCreationTimestamp="2026-03-09 04:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:13:49.683982251 +0000 UTC m=+5554.273646043" watchObservedRunningTime="2026-03-09 04:13:49.692422591 +0000 UTC m=+5554.282086393" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.085306 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-s7nhh"] Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.086733 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s7nhh" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.098382 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s7nhh"] Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.113633 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-03b9-account-create-update-jbqws"] Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.114574 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-03b9-account-create-update-jbqws" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.116510 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.124638 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-03b9-account-create-update-jbqws"] Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.194198 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btftm\" (UniqueName: \"kubernetes.io/projected/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-kube-api-access-btftm\") pod \"keystone-db-create-s7nhh\" (UID: \"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c\") " pod="openstack/keystone-db-create-s7nhh" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.194391 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-operator-scripts\") pod \"keystone-db-create-s7nhh\" (UID: \"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c\") " pod="openstack/keystone-db-create-s7nhh" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.296209 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-operator-scripts\") pod \"keystone-db-create-s7nhh\" (UID: \"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c\") " pod="openstack/keystone-db-create-s7nhh" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.296287 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btftm\" (UniqueName: \"kubernetes.io/projected/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-kube-api-access-btftm\") pod \"keystone-db-create-s7nhh\" (UID: \"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c\") " 
pod="openstack/keystone-db-create-s7nhh" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.296395 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6445c1e-79ac-4e92-b7ab-a697417e44f8-operator-scripts\") pod \"keystone-03b9-account-create-update-jbqws\" (UID: \"b6445c1e-79ac-4e92-b7ab-a697417e44f8\") " pod="openstack/keystone-03b9-account-create-update-jbqws" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.296888 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-operator-scripts\") pod \"keystone-db-create-s7nhh\" (UID: \"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c\") " pod="openstack/keystone-db-create-s7nhh" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.296931 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5vn\" (UniqueName: \"kubernetes.io/projected/b6445c1e-79ac-4e92-b7ab-a697417e44f8-kube-api-access-xb5vn\") pod \"keystone-03b9-account-create-update-jbqws\" (UID: \"b6445c1e-79ac-4e92-b7ab-a697417e44f8\") " pod="openstack/keystone-03b9-account-create-update-jbqws" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.323787 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btftm\" (UniqueName: \"kubernetes.io/projected/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-kube-api-access-btftm\") pod \"keystone-db-create-s7nhh\" (UID: \"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c\") " pod="openstack/keystone-db-create-s7nhh" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.398667 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6445c1e-79ac-4e92-b7ab-a697417e44f8-operator-scripts\") pod 
\"keystone-03b9-account-create-update-jbqws\" (UID: \"b6445c1e-79ac-4e92-b7ab-a697417e44f8\") " pod="openstack/keystone-03b9-account-create-update-jbqws" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.398726 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5vn\" (UniqueName: \"kubernetes.io/projected/b6445c1e-79ac-4e92-b7ab-a697417e44f8-kube-api-access-xb5vn\") pod \"keystone-03b9-account-create-update-jbqws\" (UID: \"b6445c1e-79ac-4e92-b7ab-a697417e44f8\") " pod="openstack/keystone-03b9-account-create-update-jbqws" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.399475 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6445c1e-79ac-4e92-b7ab-a697417e44f8-operator-scripts\") pod \"keystone-03b9-account-create-update-jbqws\" (UID: \"b6445c1e-79ac-4e92-b7ab-a697417e44f8\") " pod="openstack/keystone-03b9-account-create-update-jbqws" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.427789 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb5vn\" (UniqueName: \"kubernetes.io/projected/b6445c1e-79ac-4e92-b7ab-a697417e44f8-kube-api-access-xb5vn\") pod \"keystone-03b9-account-create-update-jbqws\" (UID: \"b6445c1e-79ac-4e92-b7ab-a697417e44f8\") " pod="openstack/keystone-03b9-account-create-update-jbqws" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.429496 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s7nhh" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.447190 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-03b9-account-create-update-jbqws" Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.781612 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-03b9-account-create-update-jbqws"] Mar 09 04:13:53 crc kubenswrapper[4901]: W0309 04:13:53.782316 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6445c1e_79ac_4e92_b7ab_a697417e44f8.slice/crio-a2cd1baa8e98a180658d3a9e45e3bbd2b03fd8de53f41455aacd3a0262664246 WatchSource:0}: Error finding container a2cd1baa8e98a180658d3a9e45e3bbd2b03fd8de53f41455aacd3a0262664246: Status 404 returned error can't find the container with id a2cd1baa8e98a180658d3a9e45e3bbd2b03fd8de53f41455aacd3a0262664246 Mar 09 04:13:53 crc kubenswrapper[4901]: I0309 04:13:53.873847 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s7nhh"] Mar 09 04:13:53 crc kubenswrapper[4901]: W0309 04:13:53.886061 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09b6b1e3_ab73_49e7_aecd_fcd62e8bbf8c.slice/crio-ae6926d6732142f2685834b25c5d663a1cc1efc2e0c97889e16f5f18ea8607d9 WatchSource:0}: Error finding container ae6926d6732142f2685834b25c5d663a1cc1efc2e0c97889e16f5f18ea8607d9: Status 404 returned error can't find the container with id ae6926d6732142f2685834b25c5d663a1cc1efc2e0c97889e16f5f18ea8607d9 Mar 09 04:13:54 crc kubenswrapper[4901]: I0309 04:13:54.718710 4901 generic.go:334] "Generic (PLEG): container finished" podID="b6445c1e-79ac-4e92-b7ab-a697417e44f8" containerID="376408b9a1611bfecca129502d5376008f1f799e32d35e8669afb17e4df0cbd3" exitCode=0 Mar 09 04:13:54 crc kubenswrapper[4901]: I0309 04:13:54.718960 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-03b9-account-create-update-jbqws" 
event={"ID":"b6445c1e-79ac-4e92-b7ab-a697417e44f8","Type":"ContainerDied","Data":"376408b9a1611bfecca129502d5376008f1f799e32d35e8669afb17e4df0cbd3"} Mar 09 04:13:54 crc kubenswrapper[4901]: I0309 04:13:54.719354 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-03b9-account-create-update-jbqws" event={"ID":"b6445c1e-79ac-4e92-b7ab-a697417e44f8","Type":"ContainerStarted","Data":"a2cd1baa8e98a180658d3a9e45e3bbd2b03fd8de53f41455aacd3a0262664246"} Mar 09 04:13:54 crc kubenswrapper[4901]: I0309 04:13:54.725200 4901 generic.go:334] "Generic (PLEG): container finished" podID="09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c" containerID="de454455d0ff97b922aa28c731915769c3c32e2566a19f004e38a89f59e7c628" exitCode=0 Mar 09 04:13:54 crc kubenswrapper[4901]: I0309 04:13:54.725589 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s7nhh" event={"ID":"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c","Type":"ContainerDied","Data":"de454455d0ff97b922aa28c731915769c3c32e2566a19f004e38a89f59e7c628"} Mar 09 04:13:54 crc kubenswrapper[4901]: I0309 04:13:54.725638 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s7nhh" event={"ID":"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c","Type":"ContainerStarted","Data":"ae6926d6732142f2685834b25c5d663a1cc1efc2e0c97889e16f5f18ea8607d9"} Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.265149 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s7nhh" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.273455 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-03b9-account-create-update-jbqws" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.455836 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btftm\" (UniqueName: \"kubernetes.io/projected/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-kube-api-access-btftm\") pod \"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c\" (UID: \"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c\") " Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.456016 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb5vn\" (UniqueName: \"kubernetes.io/projected/b6445c1e-79ac-4e92-b7ab-a697417e44f8-kube-api-access-xb5vn\") pod \"b6445c1e-79ac-4e92-b7ab-a697417e44f8\" (UID: \"b6445c1e-79ac-4e92-b7ab-a697417e44f8\") " Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.456076 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6445c1e-79ac-4e92-b7ab-a697417e44f8-operator-scripts\") pod \"b6445c1e-79ac-4e92-b7ab-a697417e44f8\" (UID: \"b6445c1e-79ac-4e92-b7ab-a697417e44f8\") " Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.456135 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-operator-scripts\") pod \"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c\" (UID: \"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c\") " Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.456931 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6445c1e-79ac-4e92-b7ab-a697417e44f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6445c1e-79ac-4e92-b7ab-a697417e44f8" (UID: "b6445c1e-79ac-4e92-b7ab-a697417e44f8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.457087 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c" (UID: "09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.464060 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-kube-api-access-btftm" (OuterVolumeSpecName: "kube-api-access-btftm") pod "09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c" (UID: "09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c"). InnerVolumeSpecName "kube-api-access-btftm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.464311 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6445c1e-79ac-4e92-b7ab-a697417e44f8-kube-api-access-xb5vn" (OuterVolumeSpecName: "kube-api-access-xb5vn") pod "b6445c1e-79ac-4e92-b7ab-a697417e44f8" (UID: "b6445c1e-79ac-4e92-b7ab-a697417e44f8"). InnerVolumeSpecName "kube-api-access-xb5vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.557950 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb5vn\" (UniqueName: \"kubernetes.io/projected/b6445c1e-79ac-4e92-b7ab-a697417e44f8-kube-api-access-xb5vn\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.557988 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6445c1e-79ac-4e92-b7ab-a697417e44f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.557999 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.558009 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btftm\" (UniqueName: \"kubernetes.io/projected/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c-kube-api-access-btftm\") on node \"crc\" DevicePath \"\"" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.751968 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-03b9-account-create-update-jbqws" event={"ID":"b6445c1e-79ac-4e92-b7ab-a697417e44f8","Type":"ContainerDied","Data":"a2cd1baa8e98a180658d3a9e45e3bbd2b03fd8de53f41455aacd3a0262664246"} Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.752031 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2cd1baa8e98a180658d3a9e45e3bbd2b03fd8de53f41455aacd3a0262664246" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.752051 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-03b9-account-create-update-jbqws" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.753988 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s7nhh" event={"ID":"09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c","Type":"ContainerDied","Data":"ae6926d6732142f2685834b25c5d663a1cc1efc2e0c97889e16f5f18ea8607d9"} Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.754025 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae6926d6732142f2685834b25c5d663a1cc1efc2e0c97889e16f5f18ea8607d9" Mar 09 04:13:56 crc kubenswrapper[4901]: I0309 04:13:56.754117 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s7nhh" Mar 09 04:13:57 crc kubenswrapper[4901]: I0309 04:13:57.106653 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:13:57 crc kubenswrapper[4901]: E0309 04:13:57.106965 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.304326 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.642788 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-stg5m"] Mar 09 04:13:58 crc kubenswrapper[4901]: E0309 04:13:58.643100 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c" 
containerName="mariadb-database-create" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.643113 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c" containerName="mariadb-database-create" Mar 09 04:13:58 crc kubenswrapper[4901]: E0309 04:13:58.643128 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6445c1e-79ac-4e92-b7ab-a697417e44f8" containerName="mariadb-account-create-update" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.643135 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6445c1e-79ac-4e92-b7ab-a697417e44f8" containerName="mariadb-account-create-update" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.643310 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c" containerName="mariadb-database-create" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.643337 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6445c1e-79ac-4e92-b7ab-a697417e44f8" containerName="mariadb-account-create-update" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.643830 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.645734 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-59p27" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.646193 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.646346 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.648088 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.663412 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-stg5m"] Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.804790 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7njp\" (UniqueName: \"kubernetes.io/projected/c9980b6c-281f-4ee2-82c5-ae0be5525a75-kube-api-access-k7njp\") pod \"keystone-db-sync-stg5m\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.804834 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-config-data\") pod \"keystone-db-sync-stg5m\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.804915 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-combined-ca-bundle\") pod \"keystone-db-sync-stg5m\" (UID: 
\"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.906867 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-combined-ca-bundle\") pod \"keystone-db-sync-stg5m\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.907032 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7njp\" (UniqueName: \"kubernetes.io/projected/c9980b6c-281f-4ee2-82c5-ae0be5525a75-kube-api-access-k7njp\") pod \"keystone-db-sync-stg5m\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.907079 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-config-data\") pod \"keystone-db-sync-stg5m\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.917185 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-config-data\") pod \"keystone-db-sync-stg5m\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.924571 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-combined-ca-bundle\") pod \"keystone-db-sync-stg5m\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:58 crc kubenswrapper[4901]: 
I0309 04:13:58.934816 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7njp\" (UniqueName: \"kubernetes.io/projected/c9980b6c-281f-4ee2-82c5-ae0be5525a75-kube-api-access-k7njp\") pod \"keystone-db-sync-stg5m\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:58 crc kubenswrapper[4901]: I0309 04:13:58.969285 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-stg5m" Mar 09 04:13:59 crc kubenswrapper[4901]: I0309 04:13:59.498484 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-stg5m"] Mar 09 04:13:59 crc kubenswrapper[4901]: I0309 04:13:59.775992 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-stg5m" event={"ID":"c9980b6c-281f-4ee2-82c5-ae0be5525a75","Type":"ContainerStarted","Data":"df28497cdaa411177171933b7a13f589d2a5cf20b3b66e05ac9eedf98db29289"} Mar 09 04:13:59 crc kubenswrapper[4901]: I0309 04:13:59.776034 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-stg5m" event={"ID":"c9980b6c-281f-4ee2-82c5-ae0be5525a75","Type":"ContainerStarted","Data":"f8d6e02951c8144b9cdf199e4b36470ed77bcf7a4affcc5803b802af1eccb116"} Mar 09 04:13:59 crc kubenswrapper[4901]: I0309 04:13:59.799632 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-stg5m" podStartSLOduration=1.799616699 podStartE2EDuration="1.799616699s" podCreationTimestamp="2026-03-09 04:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:13:59.794232307 +0000 UTC m=+5564.383896039" watchObservedRunningTime="2026-03-09 04:13:59.799616699 +0000 UTC m=+5564.389280431" Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.152070 4901 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29550494-hz2hc"] Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.153781 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550494-hz2hc" Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.157114 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.157434 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.157788 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.164890 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550494-hz2hc"] Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.332288 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r4wz\" (UniqueName: \"kubernetes.io/projected/a875c415-225e-40d6-8e3f-c0a390112936-kube-api-access-6r4wz\") pod \"auto-csr-approver-29550494-hz2hc\" (UID: \"a875c415-225e-40d6-8e3f-c0a390112936\") " pod="openshift-infra/auto-csr-approver-29550494-hz2hc" Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.433448 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r4wz\" (UniqueName: \"kubernetes.io/projected/a875c415-225e-40d6-8e3f-c0a390112936-kube-api-access-6r4wz\") pod \"auto-csr-approver-29550494-hz2hc\" (UID: \"a875c415-225e-40d6-8e3f-c0a390112936\") " pod="openshift-infra/auto-csr-approver-29550494-hz2hc" Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.449793 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r4wz\" (UniqueName: 
\"kubernetes.io/projected/a875c415-225e-40d6-8e3f-c0a390112936-kube-api-access-6r4wz\") pod \"auto-csr-approver-29550494-hz2hc\" (UID: \"a875c415-225e-40d6-8e3f-c0a390112936\") " pod="openshift-infra/auto-csr-approver-29550494-hz2hc" Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.478268 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550494-hz2hc" Mar 09 04:14:00 crc kubenswrapper[4901]: I0309 04:14:00.925240 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550494-hz2hc"] Mar 09 04:14:01 crc kubenswrapper[4901]: I0309 04:14:01.796243 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550494-hz2hc" event={"ID":"a875c415-225e-40d6-8e3f-c0a390112936","Type":"ContainerStarted","Data":"77cf977e2a08da215ca2811cbc858a8cdcdcf93518ca8c140468b531c3af9551"} Mar 09 04:14:01 crc kubenswrapper[4901]: I0309 04:14:01.798461 4901 generic.go:334] "Generic (PLEG): container finished" podID="c9980b6c-281f-4ee2-82c5-ae0be5525a75" containerID="df28497cdaa411177171933b7a13f589d2a5cf20b3b66e05ac9eedf98db29289" exitCode=0 Mar 09 04:14:01 crc kubenswrapper[4901]: I0309 04:14:01.798506 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-stg5m" event={"ID":"c9980b6c-281f-4ee2-82c5-ae0be5525a75","Type":"ContainerDied","Data":"df28497cdaa411177171933b7a13f589d2a5cf20b3b66e05ac9eedf98db29289"} Mar 09 04:14:02 crc kubenswrapper[4901]: I0309 04:14:02.818731 4901 generic.go:334] "Generic (PLEG): container finished" podID="a875c415-225e-40d6-8e3f-c0a390112936" containerID="2728ed9e1874ffe4f00aebc8ed43c6b257cf541534cbe5db334e6f08f54f592c" exitCode=0 Mar 09 04:14:02 crc kubenswrapper[4901]: I0309 04:14:02.818819 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550494-hz2hc" 
event={"ID":"a875c415-225e-40d6-8e3f-c0a390112936","Type":"ContainerDied","Data":"2728ed9e1874ffe4f00aebc8ed43c6b257cf541534cbe5db334e6f08f54f592c"} Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.246127 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-stg5m" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.388758 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7njp\" (UniqueName: \"kubernetes.io/projected/c9980b6c-281f-4ee2-82c5-ae0be5525a75-kube-api-access-k7njp\") pod \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.388821 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-config-data\") pod \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.388846 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-combined-ca-bundle\") pod \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\" (UID: \"c9980b6c-281f-4ee2-82c5-ae0be5525a75\") " Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.393954 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9980b6c-281f-4ee2-82c5-ae0be5525a75-kube-api-access-k7njp" (OuterVolumeSpecName: "kube-api-access-k7njp") pod "c9980b6c-281f-4ee2-82c5-ae0be5525a75" (UID: "c9980b6c-281f-4ee2-82c5-ae0be5525a75"). InnerVolumeSpecName "kube-api-access-k7njp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.428489 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9980b6c-281f-4ee2-82c5-ae0be5525a75" (UID: "c9980b6c-281f-4ee2-82c5-ae0be5525a75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.470007 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-config-data" (OuterVolumeSpecName: "config-data") pod "c9980b6c-281f-4ee2-82c5-ae0be5525a75" (UID: "c9980b6c-281f-4ee2-82c5-ae0be5525a75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.491785 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7njp\" (UniqueName: \"kubernetes.io/projected/c9980b6c-281f-4ee2-82c5-ae0be5525a75-kube-api-access-k7njp\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.491860 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.491884 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9980b6c-281f-4ee2-82c5-ae0be5525a75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.831963 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-stg5m" 
event={"ID":"c9980b6c-281f-4ee2-82c5-ae0be5525a75","Type":"ContainerDied","Data":"f8d6e02951c8144b9cdf199e4b36470ed77bcf7a4affcc5803b802af1eccb116"} Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.832004 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8d6e02951c8144b9cdf199e4b36470ed77bcf7a4affcc5803b802af1eccb116" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.832039 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-stg5m" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.961274 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7989cc7f6f-vv8d8"] Mar 09 04:14:03 crc kubenswrapper[4901]: E0309 04:14:03.961890 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9980b6c-281f-4ee2-82c5-ae0be5525a75" containerName="keystone-db-sync" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.961974 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9980b6c-281f-4ee2-82c5-ae0be5525a75" containerName="keystone-db-sync" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.962470 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9980b6c-281f-4ee2-82c5-ae0be5525a75" containerName="keystone-db-sync" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.963571 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:03 crc kubenswrapper[4901]: I0309 04:14:03.986055 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7989cc7f6f-vv8d8"] Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.015323 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vt7r5"] Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.016371 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.020395 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.020551 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.020695 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-59p27" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.020850 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.020874 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.030731 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vt7r5"] Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.102280 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-ovsdbserver-nb\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.102762 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-ovsdbserver-sb\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.102801 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-dns-svc\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.102832 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-config\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.102861 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfnxv\" (UniqueName: \"kubernetes.io/projected/63fc5507-22a2-4871-a6bf-557a5e4cde6b-kube-api-access-jfnxv\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.206765 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-scripts\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.207133 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-config-data\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.207163 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-credential-keys\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.207205 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-ovsdbserver-nb\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.207428 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-ovsdbserver-sb\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.207482 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-fernet-keys\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.207522 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-dns-svc\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.207548 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-combined-ca-bundle\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.207581 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-config\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.207618 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfnxv\" (UniqueName: \"kubernetes.io/projected/63fc5507-22a2-4871-a6bf-557a5e4cde6b-kube-api-access-jfnxv\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.207648 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djmzv\" (UniqueName: \"kubernetes.io/projected/943932b8-e95f-48a1-b61c-fe1efcd1bf71-kube-api-access-djmzv\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.208710 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-ovsdbserver-nb\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.208787 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-dns-svc\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.209422 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-config\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.211995 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63fc5507-22a2-4871-a6bf-557a5e4cde6b-ovsdbserver-sb\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.247084 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfnxv\" (UniqueName: \"kubernetes.io/projected/63fc5507-22a2-4871-a6bf-557a5e4cde6b-kube-api-access-jfnxv\") pod \"dnsmasq-dns-7989cc7f6f-vv8d8\" (UID: \"63fc5507-22a2-4871-a6bf-557a5e4cde6b\") " pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.283555 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.314086 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-fernet-keys\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.314160 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-combined-ca-bundle\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.314199 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djmzv\" (UniqueName: \"kubernetes.io/projected/943932b8-e95f-48a1-b61c-fe1efcd1bf71-kube-api-access-djmzv\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.314249 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-scripts\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.314293 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-config-data\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: 
I0309 04:14:04.314307 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-credential-keys\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.328843 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-credential-keys\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.338154 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-fernet-keys\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.339949 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-config-data\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.341565 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-scripts\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.343303 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-combined-ca-bundle\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.344564 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550494-hz2hc" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.353077 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djmzv\" (UniqueName: \"kubernetes.io/projected/943932b8-e95f-48a1-b61c-fe1efcd1bf71-kube-api-access-djmzv\") pod \"keystone-bootstrap-vt7r5\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.414943 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r4wz\" (UniqueName: \"kubernetes.io/projected/a875c415-225e-40d6-8e3f-c0a390112936-kube-api-access-6r4wz\") pod \"a875c415-225e-40d6-8e3f-c0a390112936\" (UID: \"a875c415-225e-40d6-8e3f-c0a390112936\") " Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.427947 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a875c415-225e-40d6-8e3f-c0a390112936-kube-api-access-6r4wz" (OuterVolumeSpecName: "kube-api-access-6r4wz") pod "a875c415-225e-40d6-8e3f-c0a390112936" (UID: "a875c415-225e-40d6-8e3f-c0a390112936"). InnerVolumeSpecName "kube-api-access-6r4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.516354 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r4wz\" (UniqueName: \"kubernetes.io/projected/a875c415-225e-40d6-8e3f-c0a390112936-kube-api-access-6r4wz\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.648769 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.806176 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7989cc7f6f-vv8d8"] Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.849821 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550494-hz2hc" event={"ID":"a875c415-225e-40d6-8e3f-c0a390112936","Type":"ContainerDied","Data":"77cf977e2a08da215ca2811cbc858a8cdcdcf93518ca8c140468b531c3af9551"} Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.849868 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77cf977e2a08da215ca2811cbc858a8cdcdcf93518ca8c140468b531c3af9551" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.849933 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550494-hz2hc" Mar 09 04:14:04 crc kubenswrapper[4901]: I0309 04:14:04.853309 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" event={"ID":"63fc5507-22a2-4871-a6bf-557a5e4cde6b","Type":"ContainerStarted","Data":"f38ed4a72dcb9519bb5d5f9ceb51fd5da03f9007a9efc1b0cd0e80562321a3a9"} Mar 09 04:14:05 crc kubenswrapper[4901]: I0309 04:14:05.116681 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vt7r5"] Mar 09 04:14:05 crc kubenswrapper[4901]: W0309 04:14:05.129488 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod943932b8_e95f_48a1_b61c_fe1efcd1bf71.slice/crio-b9be60ec13254d8713d53883e71d43d601e0dfdcc48aee88792e60391a297c85 WatchSource:0}: Error finding container b9be60ec13254d8713d53883e71d43d601e0dfdcc48aee88792e60391a297c85: Status 404 returned error can't find the container with id b9be60ec13254d8713d53883e71d43d601e0dfdcc48aee88792e60391a297c85 Mar 09 04:14:05 crc kubenswrapper[4901]: I0309 04:14:05.425870 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550488-j68gd"] Mar 09 04:14:05 crc kubenswrapper[4901]: I0309 04:14:05.431650 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550488-j68gd"] Mar 09 04:14:05 crc kubenswrapper[4901]: I0309 04:14:05.864373 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vt7r5" event={"ID":"943932b8-e95f-48a1-b61c-fe1efcd1bf71","Type":"ContainerStarted","Data":"a98051d3ad7f950bd65e2e3a6812ba7a526f78921390e4e481d3a38908b9feed"} Mar 09 04:14:05 crc kubenswrapper[4901]: I0309 04:14:05.864689 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vt7r5" 
event={"ID":"943932b8-e95f-48a1-b61c-fe1efcd1bf71","Type":"ContainerStarted","Data":"b9be60ec13254d8713d53883e71d43d601e0dfdcc48aee88792e60391a297c85"} Mar 09 04:14:05 crc kubenswrapper[4901]: I0309 04:14:05.866670 4901 generic.go:334] "Generic (PLEG): container finished" podID="63fc5507-22a2-4871-a6bf-557a5e4cde6b" containerID="9c78fdbf38550b8430d0f1a538e74f41d40e0d887e98b81e8354910de4ecf4ac" exitCode=0 Mar 09 04:14:05 crc kubenswrapper[4901]: I0309 04:14:05.866729 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" event={"ID":"63fc5507-22a2-4871-a6bf-557a5e4cde6b","Type":"ContainerDied","Data":"9c78fdbf38550b8430d0f1a538e74f41d40e0d887e98b81e8354910de4ecf4ac"} Mar 09 04:14:05 crc kubenswrapper[4901]: I0309 04:14:05.927146 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vt7r5" podStartSLOduration=2.9271059040000003 podStartE2EDuration="2.927105904s" podCreationTimestamp="2026-03-09 04:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:14:05.921765742 +0000 UTC m=+5570.511429474" watchObservedRunningTime="2026-03-09 04:14:05.927105904 +0000 UTC m=+5570.516769636" Mar 09 04:14:06 crc kubenswrapper[4901]: I0309 04:14:06.135505 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6d8792f-b941-44d3-a0ab-446c2b010bd0" path="/var/lib/kubelet/pods/b6d8792f-b941-44d3-a0ab-446c2b010bd0/volumes" Mar 09 04:14:06 crc kubenswrapper[4901]: I0309 04:14:06.878514 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" event={"ID":"63fc5507-22a2-4871-a6bf-557a5e4cde6b","Type":"ContainerStarted","Data":"e9209b62cf35e2790a019e8721408cfee8680433466df0a5a95edfbe9498e7e2"} Mar 09 04:14:06 crc kubenswrapper[4901]: I0309 04:14:06.879165 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:06 crc kubenswrapper[4901]: I0309 04:14:06.912984 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" podStartSLOduration=3.912953001 podStartE2EDuration="3.912953001s" podCreationTimestamp="2026-03-09 04:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:14:06.899982402 +0000 UTC m=+5571.489646154" watchObservedRunningTime="2026-03-09 04:14:06.912953001 +0000 UTC m=+5571.502616743" Mar 09 04:14:08 crc kubenswrapper[4901]: I0309 04:14:08.107050 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:14:08 crc kubenswrapper[4901]: E0309 04:14:08.107962 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:14:08 crc kubenswrapper[4901]: I0309 04:14:08.902928 4901 generic.go:334] "Generic (PLEG): container finished" podID="943932b8-e95f-48a1-b61c-fe1efcd1bf71" containerID="a98051d3ad7f950bd65e2e3a6812ba7a526f78921390e4e481d3a38908b9feed" exitCode=0 Mar 09 04:14:08 crc kubenswrapper[4901]: I0309 04:14:08.902984 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vt7r5" event={"ID":"943932b8-e95f-48a1-b61c-fe1efcd1bf71","Type":"ContainerDied","Data":"a98051d3ad7f950bd65e2e3a6812ba7a526f78921390e4e481d3a38908b9feed"} Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.408414 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.529981 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-credential-keys\") pod \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.530344 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-fernet-keys\") pod \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.530407 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-combined-ca-bundle\") pod \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.530473 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-scripts\") pod \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.530579 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-config-data\") pod \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.530642 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djmzv\" (UniqueName: 
\"kubernetes.io/projected/943932b8-e95f-48a1-b61c-fe1efcd1bf71-kube-api-access-djmzv\") pod \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\" (UID: \"943932b8-e95f-48a1-b61c-fe1efcd1bf71\") " Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.538896 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-scripts" (OuterVolumeSpecName: "scripts") pod "943932b8-e95f-48a1-b61c-fe1efcd1bf71" (UID: "943932b8-e95f-48a1-b61c-fe1efcd1bf71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.539091 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "943932b8-e95f-48a1-b61c-fe1efcd1bf71" (UID: "943932b8-e95f-48a1-b61c-fe1efcd1bf71"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.539572 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943932b8-e95f-48a1-b61c-fe1efcd1bf71-kube-api-access-djmzv" (OuterVolumeSpecName: "kube-api-access-djmzv") pod "943932b8-e95f-48a1-b61c-fe1efcd1bf71" (UID: "943932b8-e95f-48a1-b61c-fe1efcd1bf71"). InnerVolumeSpecName "kube-api-access-djmzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.539914 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "943932b8-e95f-48a1-b61c-fe1efcd1bf71" (UID: "943932b8-e95f-48a1-b61c-fe1efcd1bf71"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.574436 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-config-data" (OuterVolumeSpecName: "config-data") pod "943932b8-e95f-48a1-b61c-fe1efcd1bf71" (UID: "943932b8-e95f-48a1-b61c-fe1efcd1bf71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.579146 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "943932b8-e95f-48a1-b61c-fe1efcd1bf71" (UID: "943932b8-e95f-48a1-b61c-fe1efcd1bf71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.632786 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.632842 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djmzv\" (UniqueName: \"kubernetes.io/projected/943932b8-e95f-48a1-b61c-fe1efcd1bf71-kube-api-access-djmzv\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.632864 4901 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.632882 4901 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 
09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.632899 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.632918 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943932b8-e95f-48a1-b61c-fe1efcd1bf71-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.930971 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vt7r5" event={"ID":"943932b8-e95f-48a1-b61c-fe1efcd1bf71","Type":"ContainerDied","Data":"b9be60ec13254d8713d53883e71d43d601e0dfdcc48aee88792e60391a297c85"} Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.931027 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9be60ec13254d8713d53883e71d43d601e0dfdcc48aee88792e60391a297c85" Mar 09 04:14:10 crc kubenswrapper[4901]: I0309 04:14:10.931058 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vt7r5" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.029576 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vt7r5"] Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.035447 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vt7r5"] Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.116793 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-px9wn"] Mar 09 04:14:11 crc kubenswrapper[4901]: E0309 04:14:11.117394 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943932b8-e95f-48a1-b61c-fe1efcd1bf71" containerName="keystone-bootstrap" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.117420 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="943932b8-e95f-48a1-b61c-fe1efcd1bf71" containerName="keystone-bootstrap" Mar 09 04:14:11 crc kubenswrapper[4901]: E0309 04:14:11.117479 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a875c415-225e-40d6-8e3f-c0a390112936" containerName="oc" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.117495 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a875c415-225e-40d6-8e3f-c0a390112936" containerName="oc" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.117916 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a875c415-225e-40d6-8e3f-c0a390112936" containerName="oc" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.117961 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="943932b8-e95f-48a1-b61c-fe1efcd1bf71" containerName="keystone-bootstrap" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.119529 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.122714 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.123037 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-59p27" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.123319 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.123784 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.123883 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-px9wn"] Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.123944 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.244058 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-combined-ca-bundle\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.244485 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-config-data\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.244543 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w8lh7\" (UniqueName: \"kubernetes.io/projected/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-kube-api-access-w8lh7\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.244587 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-credential-keys\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.244609 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-fernet-keys\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.244636 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-scripts\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.346073 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-config-data\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.346167 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8lh7\" (UniqueName: 
\"kubernetes.io/projected/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-kube-api-access-w8lh7\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.346246 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-credential-keys\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.346284 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-fernet-keys\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.346328 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-scripts\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.346522 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-combined-ca-bundle\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.357809 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-combined-ca-bundle\") pod 
\"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.358522 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-scripts\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.357749 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-credential-keys\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.360523 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-config-data\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.361292 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-fernet-keys\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.377525 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8lh7\" (UniqueName: \"kubernetes.io/projected/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-kube-api-access-w8lh7\") pod \"keystone-bootstrap-px9wn\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc 
kubenswrapper[4901]: I0309 04:14:11.466482 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:11 crc kubenswrapper[4901]: I0309 04:14:11.956671 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-px9wn"] Mar 09 04:14:12 crc kubenswrapper[4901]: I0309 04:14:12.125706 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943932b8-e95f-48a1-b61c-fe1efcd1bf71" path="/var/lib/kubelet/pods/943932b8-e95f-48a1-b61c-fe1efcd1bf71/volumes" Mar 09 04:14:12 crc kubenswrapper[4901]: I0309 04:14:12.955439 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-px9wn" event={"ID":"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf","Type":"ContainerStarted","Data":"a1476bec4bb7633a09dfe3d8f42b2a3e0099e79486ce817ae3993dbc5c11e195"} Mar 09 04:14:12 crc kubenswrapper[4901]: I0309 04:14:12.955477 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-px9wn" event={"ID":"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf","Type":"ContainerStarted","Data":"4f80b30bbe295737ca4db1247d7ec1682bc5a88a548464dc4add49b2be894ea5"} Mar 09 04:14:12 crc kubenswrapper[4901]: I0309 04:14:12.980519 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-px9wn" podStartSLOduration=1.9804902800000002 podStartE2EDuration="1.98049028s" podCreationTimestamp="2026-03-09 04:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:14:12.970132985 +0000 UTC m=+5577.559796717" watchObservedRunningTime="2026-03-09 04:14:12.98049028 +0000 UTC m=+5577.570154072" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.030442 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7989cc7f6f-vv8d8" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 
04:14:15.149430 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-857547b655-qxjlx"] Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.149716 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-857547b655-qxjlx" podUID="cfe2f592-c7ea-4838-9cc2-281cbf406cae" containerName="dnsmasq-dns" containerID="cri-o://29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3" gracePeriod=10 Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.251286 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-857547b655-qxjlx" podUID="cfe2f592-c7ea-4838-9cc2-281cbf406cae" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.48:5353: connect: connection refused" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.644734 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.748393 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtzdg\" (UniqueName: \"kubernetes.io/projected/cfe2f592-c7ea-4838-9cc2-281cbf406cae-kube-api-access-xtzdg\") pod \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.748503 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-config\") pod \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.748595 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-dns-svc\") pod \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\" (UID: 
\"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.748708 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-sb\") pod \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.748752 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-nb\") pod \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\" (UID: \"cfe2f592-c7ea-4838-9cc2-281cbf406cae\") " Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.754657 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe2f592-c7ea-4838-9cc2-281cbf406cae-kube-api-access-xtzdg" (OuterVolumeSpecName: "kube-api-access-xtzdg") pod "cfe2f592-c7ea-4838-9cc2-281cbf406cae" (UID: "cfe2f592-c7ea-4838-9cc2-281cbf406cae"). InnerVolumeSpecName "kube-api-access-xtzdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.795029 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cfe2f592-c7ea-4838-9cc2-281cbf406cae" (UID: "cfe2f592-c7ea-4838-9cc2-281cbf406cae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.803439 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-config" (OuterVolumeSpecName: "config") pod "cfe2f592-c7ea-4838-9cc2-281cbf406cae" (UID: "cfe2f592-c7ea-4838-9cc2-281cbf406cae"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.811406 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfe2f592-c7ea-4838-9cc2-281cbf406cae" (UID: "cfe2f592-c7ea-4838-9cc2-281cbf406cae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.823133 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfe2f592-c7ea-4838-9cc2-281cbf406cae" (UID: "cfe2f592-c7ea-4838-9cc2-281cbf406cae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.851651 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-config\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.851723 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.851740 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.851752 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe2f592-c7ea-4838-9cc2-281cbf406cae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 
04:14:15 crc kubenswrapper[4901]: I0309 04:14:15.851814 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtzdg\" (UniqueName: \"kubernetes.io/projected/cfe2f592-c7ea-4838-9cc2-281cbf406cae-kube-api-access-xtzdg\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.095626 4901 generic.go:334] "Generic (PLEG): container finished" podID="cfe2f592-c7ea-4838-9cc2-281cbf406cae" containerID="29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3" exitCode=0 Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.095751 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857547b655-qxjlx" event={"ID":"cfe2f592-c7ea-4838-9cc2-281cbf406cae","Type":"ContainerDied","Data":"29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3"} Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.095872 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857547b655-qxjlx" event={"ID":"cfe2f592-c7ea-4838-9cc2-281cbf406cae","Type":"ContainerDied","Data":"ff602be06a69b748c455e7cb437cf1a307874983fdcd93f7f2c9fba5913141eb"} Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.095906 4901 scope.go:117] "RemoveContainer" containerID="29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3" Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.096075 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-857547b655-qxjlx" Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.107998 4901 generic.go:334] "Generic (PLEG): container finished" podID="9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" containerID="a1476bec4bb7633a09dfe3d8f42b2a3e0099e79486ce817ae3993dbc5c11e195" exitCode=0 Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.131131 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-px9wn" event={"ID":"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf","Type":"ContainerDied","Data":"a1476bec4bb7633a09dfe3d8f42b2a3e0099e79486ce817ae3993dbc5c11e195"} Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.139011 4901 scope.go:117] "RemoveContainer" containerID="f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe" Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.189131 4901 scope.go:117] "RemoveContainer" containerID="29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3" Mar 09 04:14:16 crc kubenswrapper[4901]: E0309 04:14:16.189583 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3\": container with ID starting with 29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3 not found: ID does not exist" containerID="29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3" Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.189612 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3"} err="failed to get container status \"29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3\": rpc error: code = NotFound desc = could not find container \"29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3\": container with ID starting with 
29032917a4112fa87357b616c68ce60f4c8b347ab5097f703120948db6bff4e3 not found: ID does not exist" Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.189634 4901 scope.go:117] "RemoveContainer" containerID="f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe" Mar 09 04:14:16 crc kubenswrapper[4901]: E0309 04:14:16.190004 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe\": container with ID starting with f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe not found: ID does not exist" containerID="f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe" Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.190073 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe"} err="failed to get container status \"f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe\": rpc error: code = NotFound desc = could not find container \"f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe\": container with ID starting with f83ae20b441e7daf6bbc7859e4aa12f8cb598e680dcc6c1787c2f12e38f815fe not found: ID does not exist" Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.190905 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-857547b655-qxjlx"] Mar 09 04:14:16 crc kubenswrapper[4901]: I0309 04:14:16.199071 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-857547b655-qxjlx"] Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.558149 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.683612 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-combined-ca-bundle\") pod \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.683668 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-config-data\") pod \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.683706 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-scripts\") pod \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.683733 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-credential-keys\") pod \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.683851 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-fernet-keys\") pod \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.683884 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8lh7\" (UniqueName: 
\"kubernetes.io/projected/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-kube-api-access-w8lh7\") pod \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\" (UID: \"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf\") " Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.691378 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" (UID: "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.691409 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-kube-api-access-w8lh7" (OuterVolumeSpecName: "kube-api-access-w8lh7") pod "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" (UID: "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf"). InnerVolumeSpecName "kube-api-access-w8lh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.691426 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" (UID: "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.691536 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-scripts" (OuterVolumeSpecName: "scripts") pod "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" (UID: "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.723757 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" (UID: "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.731788 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-config-data" (OuterVolumeSpecName: "config-data") pod "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" (UID: "9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.785437 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.785681 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.785690 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.785698 4901 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:17 crc 
kubenswrapper[4901]: I0309 04:14:17.785706 4901 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:17 crc kubenswrapper[4901]: I0309 04:14:17.785716 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8lh7\" (UniqueName: \"kubernetes.io/projected/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf-kube-api-access-w8lh7\") on node \"crc\" DevicePath \"\"" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.122620 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe2f592-c7ea-4838-9cc2-281cbf406cae" path="/var/lib/kubelet/pods/cfe2f592-c7ea-4838-9cc2-281cbf406cae/volumes" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.137310 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-px9wn" event={"ID":"9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf","Type":"ContainerDied","Data":"4f80b30bbe295737ca4db1247d7ec1682bc5a88a548464dc4add49b2be894ea5"} Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.137378 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f80b30bbe295737ca4db1247d7ec1682bc5a88a548464dc4add49b2be894ea5" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.137467 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-px9wn" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.273627 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7bd95c8c59-n8g5r"] Mar 09 04:14:18 crc kubenswrapper[4901]: E0309 04:14:18.276885 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe2f592-c7ea-4838-9cc2-281cbf406cae" containerName="dnsmasq-dns" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.276945 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe2f592-c7ea-4838-9cc2-281cbf406cae" containerName="dnsmasq-dns" Mar 09 04:14:18 crc kubenswrapper[4901]: E0309 04:14:18.276990 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe2f592-c7ea-4838-9cc2-281cbf406cae" containerName="init" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.277002 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe2f592-c7ea-4838-9cc2-281cbf406cae" containerName="init" Mar 09 04:14:18 crc kubenswrapper[4901]: E0309 04:14:18.277018 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" containerName="keystone-bootstrap" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.277029 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" containerName="keystone-bootstrap" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.277939 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" containerName="keystone-bootstrap" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.277978 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe2f592-c7ea-4838-9cc2-281cbf406cae" containerName="dnsmasq-dns" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.297874 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.301826 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.304739 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.304806 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.304843 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bd95c8c59-n8g5r"] Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.304954 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.305494 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.305703 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-59p27" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.397148 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-config-data\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.397531 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-fernet-keys\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " 
pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.397605 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-combined-ca-bundle\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.397699 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-scripts\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.397779 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-internal-tls-certs\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.397858 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-public-tls-certs\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.397931 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-credential-keys\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: 
\"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.398000 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsk5k\" (UniqueName: \"kubernetes.io/projected/4f5275cc-6600-4603-9252-11131d31cd1b-kube-api-access-zsk5k\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.500154 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-fernet-keys\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.500208 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-combined-ca-bundle\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.500281 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-scripts\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.500337 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-internal-tls-certs\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " 
pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.500377 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-public-tls-certs\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.500404 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-credential-keys\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.500434 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsk5k\" (UniqueName: \"kubernetes.io/projected/4f5275cc-6600-4603-9252-11131d31cd1b-kube-api-access-zsk5k\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.500477 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-config-data\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.511425 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-credential-keys\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 
04:14:18.515164 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-config-data\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.536916 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-fernet-keys\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.541443 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-internal-tls-certs\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.541651 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-scripts\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.541983 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-public-tls-certs\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.545352 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4f5275cc-6600-4603-9252-11131d31cd1b-combined-ca-bundle\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.549747 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsk5k\" (UniqueName: \"kubernetes.io/projected/4f5275cc-6600-4603-9252-11131d31cd1b-kube-api-access-zsk5k\") pod \"keystone-7bd95c8c59-n8g5r\" (UID: \"4f5275cc-6600-4603-9252-11131d31cd1b\") " pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:18 crc kubenswrapper[4901]: I0309 04:14:18.627650 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:19 crc kubenswrapper[4901]: I0309 04:14:19.101263 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bd95c8c59-n8g5r"] Mar 09 04:14:19 crc kubenswrapper[4901]: I0309 04:14:19.145754 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd95c8c59-n8g5r" event={"ID":"4f5275cc-6600-4603-9252-11131d31cd1b","Type":"ContainerStarted","Data":"21de50ad879b2ac6a980483af3e900911d52742405b1dd9a2bd04c929bfe9816"} Mar 09 04:14:20 crc kubenswrapper[4901]: I0309 04:14:20.156891 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd95c8c59-n8g5r" event={"ID":"4f5275cc-6600-4603-9252-11131d31cd1b","Type":"ContainerStarted","Data":"e420a3f13a2b990379ad38753f0eda07a4e3623364937c9afe9384bdb17fae60"} Mar 09 04:14:20 crc kubenswrapper[4901]: I0309 04:14:20.157239 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:20 crc kubenswrapper[4901]: I0309 04:14:20.183658 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7bd95c8c59-n8g5r" podStartSLOduration=2.18363231 podStartE2EDuration="2.18363231s" 
podCreationTimestamp="2026-03-09 04:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:14:20.179042067 +0000 UTC m=+5584.768705809" watchObservedRunningTime="2026-03-09 04:14:20.18363231 +0000 UTC m=+5584.773296052" Mar 09 04:14:21 crc kubenswrapper[4901]: I0309 04:14:21.106876 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:14:21 crc kubenswrapper[4901]: E0309 04:14:21.107206 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:14:35 crc kubenswrapper[4901]: I0309 04:14:35.106565 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:14:35 crc kubenswrapper[4901]: E0309 04:14:35.107555 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:14:42 crc kubenswrapper[4901]: I0309 04:14:42.616276 4901 scope.go:117] "RemoveContainer" containerID="07aca33045e52a42df8f9db2b5908b1ba6fb2b86444f95f48083050b438a88d7" Mar 09 04:14:50 crc kubenswrapper[4901]: I0309 04:14:50.108807 4901 scope.go:117] "RemoveContainer" 
containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:14:50 crc kubenswrapper[4901]: E0309 04:14:50.111253 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:14:50 crc kubenswrapper[4901]: I0309 04:14:50.198669 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7bd95c8c59-n8g5r" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.010998 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.013388 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.016268 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.016414 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.017340 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-dzhpr" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.029548 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.107060 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b898827-c802-48ee-b7b9-17e6a6706ef3-openstack-config\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.107488 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6mg8\" (UniqueName: \"kubernetes.io/projected/6b898827-c802-48ee-b7b9-17e6a6706ef3-kube-api-access-z6mg8\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.107629 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b898827-c802-48ee-b7b9-17e6a6706ef3-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.107695 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b898827-c802-48ee-b7b9-17e6a6706ef3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.211128 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b898827-c802-48ee-b7b9-17e6a6706ef3-openstack-config\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.211289 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6mg8\" (UniqueName: \"kubernetes.io/projected/6b898827-c802-48ee-b7b9-17e6a6706ef3-kube-api-access-z6mg8\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.211338 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b898827-c802-48ee-b7b9-17e6a6706ef3-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.211374 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b898827-c802-48ee-b7b9-17e6a6706ef3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.213049 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/6b898827-c802-48ee-b7b9-17e6a6706ef3-openstack-config\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.220177 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b898827-c802-48ee-b7b9-17e6a6706ef3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.220395 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b898827-c802-48ee-b7b9-17e6a6706ef3-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.241212 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6mg8\" (UniqueName: \"kubernetes.io/projected/6b898827-c802-48ee-b7b9-17e6a6706ef3-kube-api-access-z6mg8\") pod \"openstackclient\" (UID: \"6b898827-c802-48ee-b7b9-17e6a6706ef3\") " pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.358871 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 09 04:14:53 crc kubenswrapper[4901]: I0309 04:14:53.848386 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 04:14:54 crc kubenswrapper[4901]: I0309 04:14:54.525295 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6b898827-c802-48ee-b7b9-17e6a6706ef3","Type":"ContainerStarted","Data":"519ea8f759b8e52c1a571af6a2a32afe0f6ba6c21a6907667c30b1f126601597"} Mar 09 04:14:54 crc kubenswrapper[4901]: I0309 04:14:54.525774 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6b898827-c802-48ee-b7b9-17e6a6706ef3","Type":"ContainerStarted","Data":"55c306eb2cda35fac5d7519f98c1041237965fd7af3862a2488361ef54014be4"} Mar 09 04:14:54 crc kubenswrapper[4901]: I0309 04:14:54.553005 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.552977753 podStartE2EDuration="2.552977753s" podCreationTimestamp="2026-03-09 04:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 04:14:54.549735373 +0000 UTC m=+5619.139399175" watchObservedRunningTime="2026-03-09 04:14:54.552977753 +0000 UTC m=+5619.142641515" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.161993 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j"] Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.165283 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.168256 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.168363 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.180336 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j"] Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.258466 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bc68c54-9fba-4487-962c-59b28953727e-secret-volume\") pod \"collect-profiles-29550495-pml8j\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.258663 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22g5c\" (UniqueName: \"kubernetes.io/projected/1bc68c54-9fba-4487-962c-59b28953727e-kube-api-access-22g5c\") pod \"collect-profiles-29550495-pml8j\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.259103 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bc68c54-9fba-4487-962c-59b28953727e-config-volume\") pod \"collect-profiles-29550495-pml8j\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.360650 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22g5c\" (UniqueName: \"kubernetes.io/projected/1bc68c54-9fba-4487-962c-59b28953727e-kube-api-access-22g5c\") pod \"collect-profiles-29550495-pml8j\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.360815 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bc68c54-9fba-4487-962c-59b28953727e-config-volume\") pod \"collect-profiles-29550495-pml8j\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.360871 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bc68c54-9fba-4487-962c-59b28953727e-secret-volume\") pod \"collect-profiles-29550495-pml8j\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.362792 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bc68c54-9fba-4487-962c-59b28953727e-config-volume\") pod \"collect-profiles-29550495-pml8j\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.366885 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1bc68c54-9fba-4487-962c-59b28953727e-secret-volume\") pod \"collect-profiles-29550495-pml8j\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.379400 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22g5c\" (UniqueName: \"kubernetes.io/projected/1bc68c54-9fba-4487-962c-59b28953727e-kube-api-access-22g5c\") pod \"collect-profiles-29550495-pml8j\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:00 crc kubenswrapper[4901]: I0309 04:15:00.500497 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:01 crc kubenswrapper[4901]: I0309 04:15:01.011861 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j"] Mar 09 04:15:01 crc kubenswrapper[4901]: I0309 04:15:01.605809 4901 generic.go:334] "Generic (PLEG): container finished" podID="1bc68c54-9fba-4487-962c-59b28953727e" containerID="49619b3b76bcd71a74d97d4b3f0de4cdc12daef5657f1219d82a7a99150c7f33" exitCode=0 Mar 09 04:15:01 crc kubenswrapper[4901]: I0309 04:15:01.605876 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" event={"ID":"1bc68c54-9fba-4487-962c-59b28953727e","Type":"ContainerDied","Data":"49619b3b76bcd71a74d97d4b3f0de4cdc12daef5657f1219d82a7a99150c7f33"} Mar 09 04:15:01 crc kubenswrapper[4901]: I0309 04:15:01.605931 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" 
event={"ID":"1bc68c54-9fba-4487-962c-59b28953727e","Type":"ContainerStarted","Data":"c8bbae8f72a6ea9f33fd00e67b36edf4dbbfb047ea7e2adc2f41694c78756ef1"} Mar 09 04:15:02 crc kubenswrapper[4901]: I0309 04:15:02.107570 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:15:02 crc kubenswrapper[4901]: E0309 04:15:02.108379 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.010031 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.114877 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bc68c54-9fba-4487-962c-59b28953727e-secret-volume\") pod \"1bc68c54-9fba-4487-962c-59b28953727e\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.115024 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22g5c\" (UniqueName: \"kubernetes.io/projected/1bc68c54-9fba-4487-962c-59b28953727e-kube-api-access-22g5c\") pod \"1bc68c54-9fba-4487-962c-59b28953727e\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.115075 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1bc68c54-9fba-4487-962c-59b28953727e-config-volume\") pod \"1bc68c54-9fba-4487-962c-59b28953727e\" (UID: \"1bc68c54-9fba-4487-962c-59b28953727e\") " Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.115995 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc68c54-9fba-4487-962c-59b28953727e-config-volume" (OuterVolumeSpecName: "config-volume") pod "1bc68c54-9fba-4487-962c-59b28953727e" (UID: "1bc68c54-9fba-4487-962c-59b28953727e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.116355 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bc68c54-9fba-4487-962c-59b28953727e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.123392 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc68c54-9fba-4487-962c-59b28953727e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1bc68c54-9fba-4487-962c-59b28953727e" (UID: "1bc68c54-9fba-4487-962c-59b28953727e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.123531 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc68c54-9fba-4487-962c-59b28953727e-kube-api-access-22g5c" (OuterVolumeSpecName: "kube-api-access-22g5c") pod "1bc68c54-9fba-4487-962c-59b28953727e" (UID: "1bc68c54-9fba-4487-962c-59b28953727e"). InnerVolumeSpecName "kube-api-access-22g5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.218831 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bc68c54-9fba-4487-962c-59b28953727e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.218912 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22g5c\" (UniqueName: \"kubernetes.io/projected/1bc68c54-9fba-4487-962c-59b28953727e-kube-api-access-22g5c\") on node \"crc\" DevicePath \"\"" Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.630790 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" event={"ID":"1bc68c54-9fba-4487-962c-59b28953727e","Type":"ContainerDied","Data":"c8bbae8f72a6ea9f33fd00e67b36edf4dbbfb047ea7e2adc2f41694c78756ef1"} Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.630856 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8bbae8f72a6ea9f33fd00e67b36edf4dbbfb047ea7e2adc2f41694c78756ef1" Mar 09 04:15:03 crc kubenswrapper[4901]: I0309 04:15:03.630901 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550495-pml8j" Mar 09 04:15:04 crc kubenswrapper[4901]: I0309 04:15:04.123719 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz"] Mar 09 04:15:04 crc kubenswrapper[4901]: I0309 04:15:04.145207 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550450-h5mcz"] Mar 09 04:15:06 crc kubenswrapper[4901]: I0309 04:15:06.125735 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38e0d506-1a40-4a28-8819-ebae5d085f89" path="/var/lib/kubelet/pods/38e0d506-1a40-4a28-8819-ebae5d085f89/volumes" Mar 09 04:15:15 crc kubenswrapper[4901]: I0309 04:15:15.107570 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:15:15 crc kubenswrapper[4901]: E0309 04:15:15.108709 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:15:27 crc kubenswrapper[4901]: I0309 04:15:27.105820 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:15:27 crc kubenswrapper[4901]: E0309 04:15:27.106552 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:15:39 crc kubenswrapper[4901]: I0309 04:15:39.106854 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:15:39 crc kubenswrapper[4901]: E0309 04:15:39.107783 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:15:42 crc kubenswrapper[4901]: I0309 04:15:42.764516 4901 scope.go:117] "RemoveContainer" containerID="887521f9d676b728687ca618dbe31f6f346e44011bf697088239931c2d36482a" Mar 09 04:15:50 crc kubenswrapper[4901]: I0309 04:15:50.115638 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:15:50 crc kubenswrapper[4901]: E0309 04:15:50.116696 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.148655 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550496-nwz7b"] Mar 09 04:16:00 crc kubenswrapper[4901]: E0309 04:16:00.150946 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc68c54-9fba-4487-962c-59b28953727e" 
containerName="collect-profiles" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.150984 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc68c54-9fba-4487-962c-59b28953727e" containerName="collect-profiles" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.151809 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc68c54-9fba-4487-962c-59b28953727e" containerName="collect-profiles" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.153476 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550496-nwz7b" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.156294 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.156471 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.156511 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.157810 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550496-nwz7b"] Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.248765 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhxxb\" (UniqueName: \"kubernetes.io/projected/1f8b785e-c883-404c-ab52-a24fa9f7aac4-kube-api-access-qhxxb\") pod \"auto-csr-approver-29550496-nwz7b\" (UID: \"1f8b785e-c883-404c-ab52-a24fa9f7aac4\") " pod="openshift-infra/auto-csr-approver-29550496-nwz7b" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.350711 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhxxb\" (UniqueName: 
\"kubernetes.io/projected/1f8b785e-c883-404c-ab52-a24fa9f7aac4-kube-api-access-qhxxb\") pod \"auto-csr-approver-29550496-nwz7b\" (UID: \"1f8b785e-c883-404c-ab52-a24fa9f7aac4\") " pod="openshift-infra/auto-csr-approver-29550496-nwz7b" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.377218 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhxxb\" (UniqueName: \"kubernetes.io/projected/1f8b785e-c883-404c-ab52-a24fa9f7aac4-kube-api-access-qhxxb\") pod \"auto-csr-approver-29550496-nwz7b\" (UID: \"1f8b785e-c883-404c-ab52-a24fa9f7aac4\") " pod="openshift-infra/auto-csr-approver-29550496-nwz7b" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.478539 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550496-nwz7b" Mar 09 04:16:00 crc kubenswrapper[4901]: I0309 04:16:00.797304 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550496-nwz7b"] Mar 09 04:16:00 crc kubenswrapper[4901]: W0309 04:16:00.807525 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f8b785e_c883_404c_ab52_a24fa9f7aac4.slice/crio-2e75a58726fc9f476fe0c4828e5b35a1880124cd2764104faa6c911baf59d35d WatchSource:0}: Error finding container 2e75a58726fc9f476fe0c4828e5b35a1880124cd2764104faa6c911baf59d35d: Status 404 returned error can't find the container with id 2e75a58726fc9f476fe0c4828e5b35a1880124cd2764104faa6c911baf59d35d Mar 09 04:16:01 crc kubenswrapper[4901]: I0309 04:16:01.224475 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550496-nwz7b" event={"ID":"1f8b785e-c883-404c-ab52-a24fa9f7aac4","Type":"ContainerStarted","Data":"2e75a58726fc9f476fe0c4828e5b35a1880124cd2764104faa6c911baf59d35d"} Mar 09 04:16:02 crc kubenswrapper[4901]: I0309 04:16:02.106474 4901 scope.go:117] "RemoveContainer" 
containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:16:03 crc kubenswrapper[4901]: I0309 04:16:03.240645 4901 generic.go:334] "Generic (PLEG): container finished" podID="1f8b785e-c883-404c-ab52-a24fa9f7aac4" containerID="def578be99c9f0f716a9f6cf56e1dcf11e1a2b6201038c22c2d33d19113ad1e1" exitCode=0 Mar 09 04:16:03 crc kubenswrapper[4901]: I0309 04:16:03.240706 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550496-nwz7b" event={"ID":"1f8b785e-c883-404c-ab52-a24fa9f7aac4","Type":"ContainerDied","Data":"def578be99c9f0f716a9f6cf56e1dcf11e1a2b6201038c22c2d33d19113ad1e1"} Mar 09 04:16:03 crc kubenswrapper[4901]: I0309 04:16:03.247184 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"e7ff674ef89b0fc5d88826b6774b6df277813f850f698639d418107ba854ec9d"} Mar 09 04:16:04 crc kubenswrapper[4901]: I0309 04:16:04.592933 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550496-nwz7b" Mar 09 04:16:04 crc kubenswrapper[4901]: I0309 04:16:04.685037 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhxxb\" (UniqueName: \"kubernetes.io/projected/1f8b785e-c883-404c-ab52-a24fa9f7aac4-kube-api-access-qhxxb\") pod \"1f8b785e-c883-404c-ab52-a24fa9f7aac4\" (UID: \"1f8b785e-c883-404c-ab52-a24fa9f7aac4\") " Mar 09 04:16:04 crc kubenswrapper[4901]: I0309 04:16:04.692672 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8b785e-c883-404c-ab52-a24fa9f7aac4-kube-api-access-qhxxb" (OuterVolumeSpecName: "kube-api-access-qhxxb") pod "1f8b785e-c883-404c-ab52-a24fa9f7aac4" (UID: "1f8b785e-c883-404c-ab52-a24fa9f7aac4"). InnerVolumeSpecName "kube-api-access-qhxxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:16:04 crc kubenswrapper[4901]: I0309 04:16:04.788505 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhxxb\" (UniqueName: \"kubernetes.io/projected/1f8b785e-c883-404c-ab52-a24fa9f7aac4-kube-api-access-qhxxb\") on node \"crc\" DevicePath \"\"" Mar 09 04:16:05 crc kubenswrapper[4901]: I0309 04:16:05.272621 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550496-nwz7b" event={"ID":"1f8b785e-c883-404c-ab52-a24fa9f7aac4","Type":"ContainerDied","Data":"2e75a58726fc9f476fe0c4828e5b35a1880124cd2764104faa6c911baf59d35d"} Mar 09 04:16:05 crc kubenswrapper[4901]: I0309 04:16:05.272685 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e75a58726fc9f476fe0c4828e5b35a1880124cd2764104faa6c911baf59d35d" Mar 09 04:16:05 crc kubenswrapper[4901]: I0309 04:16:05.272693 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550496-nwz7b" Mar 09 04:16:05 crc kubenswrapper[4901]: I0309 04:16:05.681959 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550490-5xxvw"] Mar 09 04:16:05 crc kubenswrapper[4901]: I0309 04:16:05.695655 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550490-5xxvw"] Mar 09 04:16:06 crc kubenswrapper[4901]: I0309 04:16:06.125877 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cde0fc6-6b28-4065-b35e-3a9dda22574a" path="/var/lib/kubelet/pods/9cde0fc6-6b28-4065-b35e-3a9dda22574a/volumes" Mar 09 04:16:42 crc kubenswrapper[4901]: I0309 04:16:42.849353 4901 scope.go:117] "RemoveContainer" containerID="4bf87b2f9423cd63169f60194fb521e0ae02ffb275c0cc3ee1ee878233a5ce4f" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.537701 4901 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-9x4xp"] Mar 09 04:16:51 crc kubenswrapper[4901]: E0309 04:16:51.539044 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8b785e-c883-404c-ab52-a24fa9f7aac4" containerName="oc" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.539074 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8b785e-c883-404c-ab52-a24fa9f7aac4" containerName="oc" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.539585 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8b785e-c883-404c-ab52-a24fa9f7aac4" containerName="oc" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.542709 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.550956 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9x4xp"] Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.616127 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-utilities\") pod \"community-operators-9x4xp\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.616232 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f9w8\" (UniqueName: \"kubernetes.io/projected/2f255138-ad5d-424f-b498-d7caeb758e98-kube-api-access-5f9w8\") pod \"community-operators-9x4xp\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.616425 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-catalog-content\") pod \"community-operators-9x4xp\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.717582 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9w8\" (UniqueName: \"kubernetes.io/projected/2f255138-ad5d-424f-b498-d7caeb758e98-kube-api-access-5f9w8\") pod \"community-operators-9x4xp\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.717662 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-catalog-content\") pod \"community-operators-9x4xp\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.717725 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-utilities\") pod \"community-operators-9x4xp\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.718195 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-catalog-content\") pod \"community-operators-9x4xp\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.718263 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-utilities\") pod \"community-operators-9x4xp\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.750007 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f9w8\" (UniqueName: \"kubernetes.io/projected/2f255138-ad5d-424f-b498-d7caeb758e98-kube-api-access-5f9w8\") pod \"community-operators-9x4xp\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:51 crc kubenswrapper[4901]: I0309 04:16:51.882486 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:16:52 crc kubenswrapper[4901]: I0309 04:16:52.403169 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9x4xp"] Mar 09 04:16:52 crc kubenswrapper[4901]: I0309 04:16:52.720720 4901 generic.go:334] "Generic (PLEG): container finished" podID="2f255138-ad5d-424f-b498-d7caeb758e98" containerID="12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa" exitCode=0 Mar 09 04:16:52 crc kubenswrapper[4901]: I0309 04:16:52.720839 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x4xp" event={"ID":"2f255138-ad5d-424f-b498-d7caeb758e98","Type":"ContainerDied","Data":"12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa"} Mar 09 04:16:52 crc kubenswrapper[4901]: I0309 04:16:52.721090 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x4xp" event={"ID":"2f255138-ad5d-424f-b498-d7caeb758e98","Type":"ContainerStarted","Data":"c1bc85ff6fc186467732b7d47d85d0d719385e66ef49a2648b9e41bbaa295062"} Mar 09 04:16:53 crc kubenswrapper[4901]: I0309 04:16:53.733920 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-9x4xp" event={"ID":"2f255138-ad5d-424f-b498-d7caeb758e98","Type":"ContainerStarted","Data":"b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f"} Mar 09 04:16:54 crc kubenswrapper[4901]: I0309 04:16:54.747804 4901 generic.go:334] "Generic (PLEG): container finished" podID="2f255138-ad5d-424f-b498-d7caeb758e98" containerID="b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f" exitCode=0 Mar 09 04:16:54 crc kubenswrapper[4901]: I0309 04:16:54.747861 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x4xp" event={"ID":"2f255138-ad5d-424f-b498-d7caeb758e98","Type":"ContainerDied","Data":"b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f"} Mar 09 04:16:55 crc kubenswrapper[4901]: I0309 04:16:55.759759 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x4xp" event={"ID":"2f255138-ad5d-424f-b498-d7caeb758e98","Type":"ContainerStarted","Data":"c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13"} Mar 09 04:16:55 crc kubenswrapper[4901]: I0309 04:16:55.784842 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9x4xp" podStartSLOduration=2.312387955 podStartE2EDuration="4.784806937s" podCreationTimestamp="2026-03-09 04:16:51 +0000 UTC" firstStartedPulling="2026-03-09 04:16:52.723932991 +0000 UTC m=+5737.313596763" lastFinishedPulling="2026-03-09 04:16:55.196352013 +0000 UTC m=+5739.786015745" observedRunningTime="2026-03-09 04:16:55.780401378 +0000 UTC m=+5740.370065110" watchObservedRunningTime="2026-03-09 04:16:55.784806937 +0000 UTC m=+5740.374470679" Mar 09 04:17:01 crc kubenswrapper[4901]: I0309 04:17:01.883281 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:17:01 crc kubenswrapper[4901]: I0309 04:17:01.883964 
4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:17:01 crc kubenswrapper[4901]: I0309 04:17:01.965876 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:17:02 crc kubenswrapper[4901]: I0309 04:17:02.902420 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:17:02 crc kubenswrapper[4901]: I0309 04:17:02.967786 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9x4xp"] Mar 09 04:17:04 crc kubenswrapper[4901]: I0309 04:17:04.850751 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9x4xp" podUID="2f255138-ad5d-424f-b498-d7caeb758e98" containerName="registry-server" containerID="cri-o://c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13" gracePeriod=2 Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.336481 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.399573 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-utilities\") pod \"2f255138-ad5d-424f-b498-d7caeb758e98\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.399675 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-catalog-content\") pod \"2f255138-ad5d-424f-b498-d7caeb758e98\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.399774 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f9w8\" (UniqueName: \"kubernetes.io/projected/2f255138-ad5d-424f-b498-d7caeb758e98-kube-api-access-5f9w8\") pod \"2f255138-ad5d-424f-b498-d7caeb758e98\" (UID: \"2f255138-ad5d-424f-b498-d7caeb758e98\") " Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.400573 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-utilities" (OuterVolumeSpecName: "utilities") pod "2f255138-ad5d-424f-b498-d7caeb758e98" (UID: "2f255138-ad5d-424f-b498-d7caeb758e98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.405923 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f255138-ad5d-424f-b498-d7caeb758e98-kube-api-access-5f9w8" (OuterVolumeSpecName: "kube-api-access-5f9w8") pod "2f255138-ad5d-424f-b498-d7caeb758e98" (UID: "2f255138-ad5d-424f-b498-d7caeb758e98"). InnerVolumeSpecName "kube-api-access-5f9w8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.478252 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f255138-ad5d-424f-b498-d7caeb758e98" (UID: "2f255138-ad5d-424f-b498-d7caeb758e98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.501966 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f9w8\" (UniqueName: \"kubernetes.io/projected/2f255138-ad5d-424f-b498-d7caeb758e98-kube-api-access-5f9w8\") on node \"crc\" DevicePath \"\"" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.502010 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.502021 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f255138-ad5d-424f-b498-d7caeb758e98-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.864340 4901 generic.go:334] "Generic (PLEG): container finished" podID="2f255138-ad5d-424f-b498-d7caeb758e98" containerID="c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13" exitCode=0 Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.864390 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x4xp" event={"ID":"2f255138-ad5d-424f-b498-d7caeb758e98","Type":"ContainerDied","Data":"c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13"} Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.864449 4901 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-9x4xp" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.864471 4901 scope.go:117] "RemoveContainer" containerID="c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.864458 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x4xp" event={"ID":"2f255138-ad5d-424f-b498-d7caeb758e98","Type":"ContainerDied","Data":"c1bc85ff6fc186467732b7d47d85d0d719385e66ef49a2648b9e41bbaa295062"} Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.908356 4901 scope.go:117] "RemoveContainer" containerID="b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.916927 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9x4xp"] Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.925213 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9x4xp"] Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.946525 4901 scope.go:117] "RemoveContainer" containerID="12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.979968 4901 scope.go:117] "RemoveContainer" containerID="c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13" Mar 09 04:17:05 crc kubenswrapper[4901]: E0309 04:17:05.980552 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13\": container with ID starting with c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13 not found: ID does not exist" containerID="c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.980604 
4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13"} err="failed to get container status \"c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13\": rpc error: code = NotFound desc = could not find container \"c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13\": container with ID starting with c31a35ef8e26cbdc946f289600f18ea69ea7d4fc8cf42c4b3d62f9d7bb3cec13 not found: ID does not exist" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.980634 4901 scope.go:117] "RemoveContainer" containerID="b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f" Mar 09 04:17:05 crc kubenswrapper[4901]: E0309 04:17:05.981015 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f\": container with ID starting with b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f not found: ID does not exist" containerID="b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.981042 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f"} err="failed to get container status \"b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f\": rpc error: code = NotFound desc = could not find container \"b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f\": container with ID starting with b675aa6b03f8280262df12d5291151fa6c34cb0f16b4c827b96619288463514f not found: ID does not exist" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.981059 4901 scope.go:117] "RemoveContainer" containerID="12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa" Mar 09 04:17:05 crc kubenswrapper[4901]: E0309 
04:17:05.981433 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa\": container with ID starting with 12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa not found: ID does not exist" containerID="12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa" Mar 09 04:17:05 crc kubenswrapper[4901]: I0309 04:17:05.981493 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa"} err="failed to get container status \"12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa\": rpc error: code = NotFound desc = could not find container \"12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa\": container with ID starting with 12af6fedbf0af39581e62eab1c3084b6f94b3d86c3e1f077681ed134694f5faa not found: ID does not exist" Mar 09 04:17:06 crc kubenswrapper[4901]: I0309 04:17:06.121619 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f255138-ad5d-424f-b498-d7caeb758e98" path="/var/lib/kubelet/pods/2f255138-ad5d-424f-b498-d7caeb758e98/volumes" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.585416 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ppbts"] Mar 09 04:17:14 crc kubenswrapper[4901]: E0309 04:17:14.586503 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f255138-ad5d-424f-b498-d7caeb758e98" containerName="extract-utilities" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.586522 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f255138-ad5d-424f-b498-d7caeb758e98" containerName="extract-utilities" Mar 09 04:17:14 crc kubenswrapper[4901]: E0309 04:17:14.586552 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2f255138-ad5d-424f-b498-d7caeb758e98" containerName="extract-content" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.586561 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f255138-ad5d-424f-b498-d7caeb758e98" containerName="extract-content" Mar 09 04:17:14 crc kubenswrapper[4901]: E0309 04:17:14.586575 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f255138-ad5d-424f-b498-d7caeb758e98" containerName="registry-server" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.586584 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f255138-ad5d-424f-b498-d7caeb758e98" containerName="registry-server" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.586804 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f255138-ad5d-424f-b498-d7caeb758e98" containerName="registry-server" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.588292 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.607651 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ppbts"] Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.688705 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-utilities\") pod \"redhat-marketplace-ppbts\" (UID: \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.688880 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhtvl\" (UniqueName: \"kubernetes.io/projected/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-kube-api-access-zhtvl\") pod \"redhat-marketplace-ppbts\" (UID: 
\"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.689465 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-catalog-content\") pod \"redhat-marketplace-ppbts\" (UID: \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.791360 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-catalog-content\") pod \"redhat-marketplace-ppbts\" (UID: \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.791437 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-utilities\") pod \"redhat-marketplace-ppbts\" (UID: \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.791526 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhtvl\" (UniqueName: \"kubernetes.io/projected/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-kube-api-access-zhtvl\") pod \"redhat-marketplace-ppbts\" (UID: \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.792052 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-catalog-content\") pod \"redhat-marketplace-ppbts\" (UID: 
\"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.792280 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-utilities\") pod \"redhat-marketplace-ppbts\" (UID: \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.824075 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhtvl\" (UniqueName: \"kubernetes.io/projected/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-kube-api-access-zhtvl\") pod \"redhat-marketplace-ppbts\" (UID: \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:14 crc kubenswrapper[4901]: I0309 04:17:14.922842 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:15 crc kubenswrapper[4901]: I0309 04:17:15.399971 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ppbts"] Mar 09 04:17:16 crc kubenswrapper[4901]: I0309 04:17:16.119742 4901 generic.go:334] "Generic (PLEG): container finished" podID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" containerID="02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902" exitCode=0 Mar 09 04:17:16 crc kubenswrapper[4901]: I0309 04:17:16.146358 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 04:17:16 crc kubenswrapper[4901]: I0309 04:17:16.157587 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ppbts" event={"ID":"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2","Type":"ContainerDied","Data":"02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902"} Mar 09 04:17:16 crc 
kubenswrapper[4901]: I0309 04:17:16.157650 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ppbts" event={"ID":"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2","Type":"ContainerStarted","Data":"b3dbf2ce257d9eda277124075132fbf8a7f532ee5da1340c80598efd183c497a"} Mar 09 04:17:17 crc kubenswrapper[4901]: I0309 04:17:17.143865 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ppbts" event={"ID":"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2","Type":"ContainerStarted","Data":"760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a"} Mar 09 04:17:18 crc kubenswrapper[4901]: I0309 04:17:18.168810 4901 generic.go:334] "Generic (PLEG): container finished" podID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" containerID="760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a" exitCode=0 Mar 09 04:17:18 crc kubenswrapper[4901]: I0309 04:17:18.169923 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ppbts" event={"ID":"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2","Type":"ContainerDied","Data":"760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a"} Mar 09 04:17:19 crc kubenswrapper[4901]: I0309 04:17:19.186806 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ppbts" event={"ID":"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2","Type":"ContainerStarted","Data":"50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733"} Mar 09 04:17:19 crc kubenswrapper[4901]: I0309 04:17:19.218860 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ppbts" podStartSLOduration=2.677508469 podStartE2EDuration="5.218834007s" podCreationTimestamp="2026-03-09 04:17:14 +0000 UTC" firstStartedPulling="2026-03-09 04:17:16.145877713 +0000 UTC m=+5760.735541485" lastFinishedPulling="2026-03-09 04:17:18.687203251 +0000 UTC m=+5763.276867023" 
observedRunningTime="2026-03-09 04:17:19.209499148 +0000 UTC m=+5763.799162940" watchObservedRunningTime="2026-03-09 04:17:19.218834007 +0000 UTC m=+5763.808497779" Mar 09 04:17:20 crc kubenswrapper[4901]: I0309 04:17:20.078815 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wp4lm"] Mar 09 04:17:20 crc kubenswrapper[4901]: I0309 04:17:20.088267 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wp4lm"] Mar 09 04:17:20 crc kubenswrapper[4901]: I0309 04:17:20.123846 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c413e1-2b2d-45c5-be20-c8b4fc90c324" path="/var/lib/kubelet/pods/a5c413e1-2b2d-45c5-be20-c8b4fc90c324/volumes" Mar 09 04:17:24 crc kubenswrapper[4901]: I0309 04:17:24.923438 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:24 crc kubenswrapper[4901]: I0309 04:17:24.923903 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:25 crc kubenswrapper[4901]: I0309 04:17:25.035788 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:25 crc kubenswrapper[4901]: I0309 04:17:25.319871 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:25 crc kubenswrapper[4901]: I0309 04:17:25.368916 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ppbts"] Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.275550 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ppbts" podUID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" containerName="registry-server" 
containerID="cri-o://50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733" gracePeriod=2 Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.809756 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.841918 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-catalog-content\") pod \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\" (UID: \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.842241 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhtvl\" (UniqueName: \"kubernetes.io/projected/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-kube-api-access-zhtvl\") pod \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\" (UID: \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.842287 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-utilities\") pod \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\" (UID: \"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2\") " Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.843686 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-utilities" (OuterVolumeSpecName: "utilities") pod "bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" (UID: "bec0c488-a1d0-4c5b-bdd1-5094b74d30e2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.855483 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-kube-api-access-zhtvl" (OuterVolumeSpecName: "kube-api-access-zhtvl") pod "bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" (UID: "bec0c488-a1d0-4c5b-bdd1-5094b74d30e2"). InnerVolumeSpecName "kube-api-access-zhtvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.871846 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" (UID: "bec0c488-a1d0-4c5b-bdd1-5094b74d30e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.944427 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.944466 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:17:27 crc kubenswrapper[4901]: I0309 04:17:27.944485 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhtvl\" (UniqueName: \"kubernetes.io/projected/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2-kube-api-access-zhtvl\") on node \"crc\" DevicePath \"\"" Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.286491 4901 generic.go:334] "Generic (PLEG): container finished" podID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" 
containerID="50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733" exitCode=0 Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.286540 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ppbts" event={"ID":"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2","Type":"ContainerDied","Data":"50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733"} Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.286571 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ppbts" event={"ID":"bec0c488-a1d0-4c5b-bdd1-5094b74d30e2","Type":"ContainerDied","Data":"b3dbf2ce257d9eda277124075132fbf8a7f532ee5da1340c80598efd183c497a"} Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.286590 4901 scope.go:117] "RemoveContainer" containerID="50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733" Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.286730 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ppbts" Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.315199 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ppbts"] Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.315424 4901 scope.go:117] "RemoveContainer" containerID="760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a" Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.327394 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ppbts"] Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.344568 4901 scope.go:117] "RemoveContainer" containerID="02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902" Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.382775 4901 scope.go:117] "RemoveContainer" containerID="50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733" Mar 09 04:17:28 crc kubenswrapper[4901]: E0309 04:17:28.383500 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733\": container with ID starting with 50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733 not found: ID does not exist" containerID="50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733" Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.383542 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733"} err="failed to get container status \"50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733\": rpc error: code = NotFound desc = could not find container \"50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733\": container with ID starting with 50df445d08829a4fde29c11f23f6a8931aa327c2d1ebda2a7ed8e19ee8403733 not found: 
ID does not exist" Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.383570 4901 scope.go:117] "RemoveContainer" containerID="760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a" Mar 09 04:17:28 crc kubenswrapper[4901]: E0309 04:17:28.383969 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a\": container with ID starting with 760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a not found: ID does not exist" containerID="760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a" Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.384027 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a"} err="failed to get container status \"760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a\": rpc error: code = NotFound desc = could not find container \"760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a\": container with ID starting with 760458aaa136769d91d56dc710fc2db1fbb6a05bb7b3d47d34e6b845a4ccec3a not found: ID does not exist" Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.384063 4901 scope.go:117] "RemoveContainer" containerID="02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902" Mar 09 04:17:28 crc kubenswrapper[4901]: E0309 04:17:28.384379 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902\": container with ID starting with 02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902 not found: ID does not exist" containerID="02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902" Mar 09 04:17:28 crc kubenswrapper[4901]: I0309 04:17:28.384412 4901 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902"} err="failed to get container status \"02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902\": rpc error: code = NotFound desc = could not find container \"02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902\": container with ID starting with 02fcd9c7763cf77acb712a64b8ab0ec5ecb10947507ba66dc3799a43ace5b902 not found: ID does not exist" Mar 09 04:17:30 crc kubenswrapper[4901]: I0309 04:17:30.126827 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" path="/var/lib/kubelet/pods/bec0c488-a1d0-4c5b-bdd1-5094b74d30e2/volumes" Mar 09 04:17:42 crc kubenswrapper[4901]: I0309 04:17:42.934341 4901 scope.go:117] "RemoveContainer" containerID="a0abcad488c073232285e86fb3e59aaec44a6017e059d27e01e0859726eb2882" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.164159 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550498-mpkrx"] Mar 09 04:18:00 crc kubenswrapper[4901]: E0309 04:18:00.165286 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" containerName="extract-content" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.165308 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" containerName="extract-content" Mar 09 04:18:00 crc kubenswrapper[4901]: E0309 04:18:00.165344 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" containerName="extract-utilities" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.165356 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" containerName="extract-utilities" Mar 09 04:18:00 crc kubenswrapper[4901]: E0309 04:18:00.165382 4901 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" containerName="registry-server" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.165395 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" containerName="registry-server" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.165713 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec0c488-a1d0-4c5b-bdd1-5094b74d30e2" containerName="registry-server" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.166599 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550498-mpkrx" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.169705 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.170280 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.170991 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.186128 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550498-mpkrx"] Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.326717 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2msh\" (UniqueName: \"kubernetes.io/projected/391820fa-b20b-475a-a06d-326041cf8728-kube-api-access-w2msh\") pod \"auto-csr-approver-29550498-mpkrx\" (UID: \"391820fa-b20b-475a-a06d-326041cf8728\") " pod="openshift-infra/auto-csr-approver-29550498-mpkrx" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.428596 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w2msh\" (UniqueName: \"kubernetes.io/projected/391820fa-b20b-475a-a06d-326041cf8728-kube-api-access-w2msh\") pod \"auto-csr-approver-29550498-mpkrx\" (UID: \"391820fa-b20b-475a-a06d-326041cf8728\") " pod="openshift-infra/auto-csr-approver-29550498-mpkrx" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.454975 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2msh\" (UniqueName: \"kubernetes.io/projected/391820fa-b20b-475a-a06d-326041cf8728-kube-api-access-w2msh\") pod \"auto-csr-approver-29550498-mpkrx\" (UID: \"391820fa-b20b-475a-a06d-326041cf8728\") " pod="openshift-infra/auto-csr-approver-29550498-mpkrx" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.494397 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550498-mpkrx" Mar 09 04:18:00 crc kubenswrapper[4901]: I0309 04:18:00.803144 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550498-mpkrx"] Mar 09 04:18:01 crc kubenswrapper[4901]: I0309 04:18:01.594289 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550498-mpkrx" event={"ID":"391820fa-b20b-475a-a06d-326041cf8728","Type":"ContainerStarted","Data":"508468564dead0c07d7733105c10272c1134978feb979ab437a216db4e068f07"} Mar 09 04:18:02 crc kubenswrapper[4901]: I0309 04:18:02.606405 4901 generic.go:334] "Generic (PLEG): container finished" podID="391820fa-b20b-475a-a06d-326041cf8728" containerID="7684a5e7253918e334c6c32e9589d03561b9feaab1c38a27e3370cd0b552990e" exitCode=0 Mar 09 04:18:02 crc kubenswrapper[4901]: I0309 04:18:02.606482 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550498-mpkrx" event={"ID":"391820fa-b20b-475a-a06d-326041cf8728","Type":"ContainerDied","Data":"7684a5e7253918e334c6c32e9589d03561b9feaab1c38a27e3370cd0b552990e"} Mar 09 04:18:04 crc kubenswrapper[4901]: I0309 
04:18:04.018683 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550498-mpkrx" Mar 09 04:18:04 crc kubenswrapper[4901]: I0309 04:18:04.193763 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2msh\" (UniqueName: \"kubernetes.io/projected/391820fa-b20b-475a-a06d-326041cf8728-kube-api-access-w2msh\") pod \"391820fa-b20b-475a-a06d-326041cf8728\" (UID: \"391820fa-b20b-475a-a06d-326041cf8728\") " Mar 09 04:18:04 crc kubenswrapper[4901]: I0309 04:18:04.199812 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391820fa-b20b-475a-a06d-326041cf8728-kube-api-access-w2msh" (OuterVolumeSpecName: "kube-api-access-w2msh") pod "391820fa-b20b-475a-a06d-326041cf8728" (UID: "391820fa-b20b-475a-a06d-326041cf8728"). InnerVolumeSpecName "kube-api-access-w2msh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:18:04 crc kubenswrapper[4901]: I0309 04:18:04.295928 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2msh\" (UniqueName: \"kubernetes.io/projected/391820fa-b20b-475a-a06d-326041cf8728-kube-api-access-w2msh\") on node \"crc\" DevicePath \"\"" Mar 09 04:18:04 crc kubenswrapper[4901]: I0309 04:18:04.636546 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550498-mpkrx" event={"ID":"391820fa-b20b-475a-a06d-326041cf8728","Type":"ContainerDied","Data":"508468564dead0c07d7733105c10272c1134978feb979ab437a216db4e068f07"} Mar 09 04:18:04 crc kubenswrapper[4901]: I0309 04:18:04.636598 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="508468564dead0c07d7733105c10272c1134978feb979ab437a216db4e068f07" Mar 09 04:18:04 crc kubenswrapper[4901]: I0309 04:18:04.636622 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550498-mpkrx" Mar 09 04:18:05 crc kubenswrapper[4901]: I0309 04:18:05.118521 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550492-bxlcq"] Mar 09 04:18:05 crc kubenswrapper[4901]: I0309 04:18:05.129826 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550492-bxlcq"] Mar 09 04:18:06 crc kubenswrapper[4901]: I0309 04:18:06.125098 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a165250-7b91-481f-80f9-1561a790b7c9" path="/var/lib/kubelet/pods/4a165250-7b91-481f-80f9-1561a790b7c9/volumes" Mar 09 04:18:30 crc kubenswrapper[4901]: I0309 04:18:30.863190 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:18:30 crc kubenswrapper[4901]: I0309 04:18:30.863873 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:18:43 crc kubenswrapper[4901]: I0309 04:18:43.023061 4901 scope.go:117] "RemoveContainer" containerID="7fb4cff40f2d552f73408e05cbd3fd49b9c40585ea6a0fa507d15e7ef5bc3a16" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.385944 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bts42"] Mar 09 04:19:00 crc kubenswrapper[4901]: E0309 04:19:00.387065 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391820fa-b20b-475a-a06d-326041cf8728" containerName="oc" Mar 09 04:19:00 crc 
kubenswrapper[4901]: I0309 04:19:00.387087 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="391820fa-b20b-475a-a06d-326041cf8728" containerName="oc" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.387434 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="391820fa-b20b-475a-a06d-326041cf8728" containerName="oc" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.389735 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.412723 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bts42"] Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.482002 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdlcm\" (UniqueName: \"kubernetes.io/projected/f37c48cb-e13a-42e6-9261-d5898938bbe0-kube-api-access-zdlcm\") pod \"redhat-operators-bts42\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.482128 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-catalog-content\") pod \"redhat-operators-bts42\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.482437 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-utilities\") pod \"redhat-operators-bts42\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 
04:19:00.583656 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-catalog-content\") pod \"redhat-operators-bts42\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.583790 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-utilities\") pod \"redhat-operators-bts42\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.583875 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdlcm\" (UniqueName: \"kubernetes.io/projected/f37c48cb-e13a-42e6-9261-d5898938bbe0-kube-api-access-zdlcm\") pod \"redhat-operators-bts42\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.584555 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-catalog-content\") pod \"redhat-operators-bts42\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.584729 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-utilities\") pod \"redhat-operators-bts42\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.610288 4901 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zdlcm\" (UniqueName: \"kubernetes.io/projected/f37c48cb-e13a-42e6-9261-d5898938bbe0-kube-api-access-zdlcm\") pod \"redhat-operators-bts42\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.731027 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.863481 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:19:00 crc kubenswrapper[4901]: I0309 04:19:00.864209 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:19:01 crc kubenswrapper[4901]: I0309 04:19:01.275198 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bts42"] Mar 09 04:19:02 crc kubenswrapper[4901]: I0309 04:19:02.271407 4901 generic.go:334] "Generic (PLEG): container finished" podID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerID="4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3" exitCode=0 Mar 09 04:19:02 crc kubenswrapper[4901]: I0309 04:19:02.271566 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bts42" event={"ID":"f37c48cb-e13a-42e6-9261-d5898938bbe0","Type":"ContainerDied","Data":"4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3"} Mar 09 04:19:02 crc 
kubenswrapper[4901]: I0309 04:19:02.271685 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bts42" event={"ID":"f37c48cb-e13a-42e6-9261-d5898938bbe0","Type":"ContainerStarted","Data":"4fda98d8fe530dd8e4b63dcdeac8a0e3ed256688300209a07b20499867fafc28"} Mar 09 04:19:04 crc kubenswrapper[4901]: I0309 04:19:04.292885 4901 generic.go:334] "Generic (PLEG): container finished" podID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerID="3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7" exitCode=0 Mar 09 04:19:04 crc kubenswrapper[4901]: I0309 04:19:04.292983 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bts42" event={"ID":"f37c48cb-e13a-42e6-9261-d5898938bbe0","Type":"ContainerDied","Data":"3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7"} Mar 09 04:19:05 crc kubenswrapper[4901]: I0309 04:19:05.311528 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bts42" event={"ID":"f37c48cb-e13a-42e6-9261-d5898938bbe0","Type":"ContainerStarted","Data":"4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7"} Mar 09 04:19:05 crc kubenswrapper[4901]: I0309 04:19:05.352656 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bts42" podStartSLOduration=2.954713507 podStartE2EDuration="5.352630296s" podCreationTimestamp="2026-03-09 04:19:00 +0000 UTC" firstStartedPulling="2026-03-09 04:19:02.275145732 +0000 UTC m=+5866.864809504" lastFinishedPulling="2026-03-09 04:19:04.673062521 +0000 UTC m=+5869.262726293" observedRunningTime="2026-03-09 04:19:05.344352343 +0000 UTC m=+5869.934016115" watchObservedRunningTime="2026-03-09 04:19:05.352630296 +0000 UTC m=+5869.942294038" Mar 09 04:19:10 crc kubenswrapper[4901]: I0309 04:19:10.731379 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:10 crc kubenswrapper[4901]: I0309 04:19:10.731774 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:11 crc kubenswrapper[4901]: I0309 04:19:11.813554 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bts42" podUID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerName="registry-server" probeResult="failure" output=< Mar 09 04:19:11 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 04:19:11 crc kubenswrapper[4901]: > Mar 09 04:19:20 crc kubenswrapper[4901]: I0309 04:19:20.810365 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:20 crc kubenswrapper[4901]: I0309 04:19:20.895752 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:21 crc kubenswrapper[4901]: I0309 04:19:21.066356 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bts42"] Mar 09 04:19:22 crc kubenswrapper[4901]: I0309 04:19:22.152882 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bts42" podUID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerName="registry-server" containerID="cri-o://4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7" gracePeriod=2 Mar 09 04:19:22 crc kubenswrapper[4901]: I0309 04:19:22.696435 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:22 crc kubenswrapper[4901]: I0309 04:19:22.820314 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-catalog-content\") pod \"f37c48cb-e13a-42e6-9261-d5898938bbe0\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " Mar 09 04:19:22 crc kubenswrapper[4901]: I0309 04:19:22.820568 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdlcm\" (UniqueName: \"kubernetes.io/projected/f37c48cb-e13a-42e6-9261-d5898938bbe0-kube-api-access-zdlcm\") pod \"f37c48cb-e13a-42e6-9261-d5898938bbe0\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " Mar 09 04:19:22 crc kubenswrapper[4901]: I0309 04:19:22.820692 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-utilities\") pod \"f37c48cb-e13a-42e6-9261-d5898938bbe0\" (UID: \"f37c48cb-e13a-42e6-9261-d5898938bbe0\") " Mar 09 04:19:22 crc kubenswrapper[4901]: I0309 04:19:22.821725 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-utilities" (OuterVolumeSpecName: "utilities") pod "f37c48cb-e13a-42e6-9261-d5898938bbe0" (UID: "f37c48cb-e13a-42e6-9261-d5898938bbe0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:19:22 crc kubenswrapper[4901]: I0309 04:19:22.826416 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37c48cb-e13a-42e6-9261-d5898938bbe0-kube-api-access-zdlcm" (OuterVolumeSpecName: "kube-api-access-zdlcm") pod "f37c48cb-e13a-42e6-9261-d5898938bbe0" (UID: "f37c48cb-e13a-42e6-9261-d5898938bbe0"). InnerVolumeSpecName "kube-api-access-zdlcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:19:22 crc kubenswrapper[4901]: I0309 04:19:22.928647 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdlcm\" (UniqueName: \"kubernetes.io/projected/f37c48cb-e13a-42e6-9261-d5898938bbe0-kube-api-access-zdlcm\") on node \"crc\" DevicePath \"\"" Mar 09 04:19:22 crc kubenswrapper[4901]: I0309 04:19:22.928715 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:19:22 crc kubenswrapper[4901]: I0309 04:19:22.956408 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f37c48cb-e13a-42e6-9261-d5898938bbe0" (UID: "f37c48cb-e13a-42e6-9261-d5898938bbe0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.029962 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37c48cb-e13a-42e6-9261-d5898938bbe0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.167734 4901 generic.go:334] "Generic (PLEG): container finished" podID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerID="4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7" exitCode=0 Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.167790 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bts42" event={"ID":"f37c48cb-e13a-42e6-9261-d5898938bbe0","Type":"ContainerDied","Data":"4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7"} Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.167843 4901 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bts42" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.167859 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bts42" event={"ID":"f37c48cb-e13a-42e6-9261-d5898938bbe0","Type":"ContainerDied","Data":"4fda98d8fe530dd8e4b63dcdeac8a0e3ed256688300209a07b20499867fafc28"} Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.167891 4901 scope.go:117] "RemoveContainer" containerID="4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.194876 4901 scope.go:117] "RemoveContainer" containerID="3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.218917 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bts42"] Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.226169 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bts42"] Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.253530 4901 scope.go:117] "RemoveContainer" containerID="4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.277816 4901 scope.go:117] "RemoveContainer" containerID="4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7" Mar 09 04:19:23 crc kubenswrapper[4901]: E0309 04:19:23.278181 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7\": container with ID starting with 4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7 not found: ID does not exist" containerID="4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.278244 4901 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7"} err="failed to get container status \"4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7\": rpc error: code = NotFound desc = could not find container \"4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7\": container with ID starting with 4a04c8992f85146038a560ac72e439d369969a0cdc0e1acd5a1cfe4ad46ec5a7 not found: ID does not exist" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.278269 4901 scope.go:117] "RemoveContainer" containerID="3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7" Mar 09 04:19:23 crc kubenswrapper[4901]: E0309 04:19:23.278501 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7\": container with ID starting with 3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7 not found: ID does not exist" containerID="3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.278554 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7"} err="failed to get container status \"3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7\": rpc error: code = NotFound desc = could not find container \"3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7\": container with ID starting with 3fe699db26e97728e9ce01ae902a0015c43409c20e4a6c37157e471c23016ef7 not found: ID does not exist" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.278569 4901 scope.go:117] "RemoveContainer" containerID="4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3" Mar 09 04:19:23 crc kubenswrapper[4901]: E0309 
04:19:23.278775 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3\": container with ID starting with 4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3 not found: ID does not exist" containerID="4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3" Mar 09 04:19:23 crc kubenswrapper[4901]: I0309 04:19:23.278805 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3"} err="failed to get container status \"4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3\": rpc error: code = NotFound desc = could not find container \"4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3\": container with ID starting with 4d876ade2bac54538d75571bc65c31afe45c3c186d1cbf4c946c7d5e1dd03fb3 not found: ID does not exist" Mar 09 04:19:24 crc kubenswrapper[4901]: I0309 04:19:24.128355 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f37c48cb-e13a-42e6-9261-d5898938bbe0" path="/var/lib/kubelet/pods/f37c48cb-e13a-42e6-9261-d5898938bbe0/volumes" Mar 09 04:19:30 crc kubenswrapper[4901]: I0309 04:19:30.863113 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:19:30 crc kubenswrapper[4901]: I0309 04:19:30.863944 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 09 04:19:30 crc kubenswrapper[4901]: I0309 04:19:30.864031 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 04:19:30 crc kubenswrapper[4901]: I0309 04:19:30.865166 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e7ff674ef89b0fc5d88826b6774b6df277813f850f698639d418107ba854ec9d"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 04:19:30 crc kubenswrapper[4901]: I0309 04:19:30.865312 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://e7ff674ef89b0fc5d88826b6774b6df277813f850f698639d418107ba854ec9d" gracePeriod=600 Mar 09 04:19:31 crc kubenswrapper[4901]: I0309 04:19:31.258429 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="e7ff674ef89b0fc5d88826b6774b6df277813f850f698639d418107ba854ec9d" exitCode=0 Mar 09 04:19:31 crc kubenswrapper[4901]: I0309 04:19:31.258527 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"e7ff674ef89b0fc5d88826b6774b6df277813f850f698639d418107ba854ec9d"} Mar 09 04:19:31 crc kubenswrapper[4901]: I0309 04:19:31.258898 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b"} Mar 09 04:19:31 crc 
kubenswrapper[4901]: I0309 04:19:31.258937 4901 scope.go:117] "RemoveContainer" containerID="d217497ad345fae7da1bdd0eb24a9199e10562d0bb7d6f5c45a95f7e79585459" Mar 09 04:19:43 crc kubenswrapper[4901]: I0309 04:19:43.111368 4901 scope.go:117] "RemoveContainer" containerID="076be717e745166e865c397bca8d8d5d5e33035f0e3bd8455106f955f417df1a" Mar 09 04:19:43 crc kubenswrapper[4901]: I0309 04:19:43.143269 4901 scope.go:117] "RemoveContainer" containerID="b64c9fa4a78cb15087c846ae2b8aab8a8f0aad175208ab7db5a7c8eb04d05ea6" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.179522 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550500-sz6h7"] Mar 09 04:20:00 crc kubenswrapper[4901]: E0309 04:20:00.181198 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerName="registry-server" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.181266 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerName="registry-server" Mar 09 04:20:00 crc kubenswrapper[4901]: E0309 04:20:00.181357 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerName="extract-content" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.181378 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerName="extract-content" Mar 09 04:20:00 crc kubenswrapper[4901]: E0309 04:20:00.181405 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerName="extract-utilities" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.181424 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerName="extract-utilities" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.181918 4901 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f37c48cb-e13a-42e6-9261-d5898938bbe0" containerName="registry-server" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.183176 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550500-sz6h7" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.186840 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.187048 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.187296 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.193413 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550500-sz6h7"] Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.295144 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh4rd\" (UniqueName: \"kubernetes.io/projected/d8db118a-59ed-4968-a6d1-52ebcbf1feca-kube-api-access-nh4rd\") pod \"auto-csr-approver-29550500-sz6h7\" (UID: \"d8db118a-59ed-4968-a6d1-52ebcbf1feca\") " pod="openshift-infra/auto-csr-approver-29550500-sz6h7" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.397252 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh4rd\" (UniqueName: \"kubernetes.io/projected/d8db118a-59ed-4968-a6d1-52ebcbf1feca-kube-api-access-nh4rd\") pod \"auto-csr-approver-29550500-sz6h7\" (UID: \"d8db118a-59ed-4968-a6d1-52ebcbf1feca\") " pod="openshift-infra/auto-csr-approver-29550500-sz6h7" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.430629 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nh4rd\" (UniqueName: \"kubernetes.io/projected/d8db118a-59ed-4968-a6d1-52ebcbf1feca-kube-api-access-nh4rd\") pod \"auto-csr-approver-29550500-sz6h7\" (UID: \"d8db118a-59ed-4968-a6d1-52ebcbf1feca\") " pod="openshift-infra/auto-csr-approver-29550500-sz6h7" Mar 09 04:20:00 crc kubenswrapper[4901]: I0309 04:20:00.521138 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550500-sz6h7" Mar 09 04:20:01 crc kubenswrapper[4901]: I0309 04:20:01.014347 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550500-sz6h7"] Mar 09 04:20:01 crc kubenswrapper[4901]: I0309 04:20:01.589744 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550500-sz6h7" event={"ID":"d8db118a-59ed-4968-a6d1-52ebcbf1feca","Type":"ContainerStarted","Data":"f356c744ede87af504a8b03fed330765841a9668b7d1634bcbf9104db2538406"} Mar 09 04:20:02 crc kubenswrapper[4901]: I0309 04:20:02.608904 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550500-sz6h7" event={"ID":"d8db118a-59ed-4968-a6d1-52ebcbf1feca","Type":"ContainerStarted","Data":"e105a17d1f123a53db81a40583199747b0fa52dc9c4dbed951475f7aed30d7e4"} Mar 09 04:20:02 crc kubenswrapper[4901]: I0309 04:20:02.637008 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550500-sz6h7" podStartSLOduration=1.424638683 podStartE2EDuration="2.636975872s" podCreationTimestamp="2026-03-09 04:20:00 +0000 UTC" firstStartedPulling="2026-03-09 04:20:01.025441845 +0000 UTC m=+5925.615105597" lastFinishedPulling="2026-03-09 04:20:02.237779014 +0000 UTC m=+5926.827442786" observedRunningTime="2026-03-09 04:20:02.629422647 +0000 UTC m=+5927.219086459" watchObservedRunningTime="2026-03-09 04:20:02.636975872 +0000 UTC m=+5927.226639654" Mar 09 04:20:03 crc kubenswrapper[4901]: I0309 04:20:03.619934 4901 
generic.go:334] "Generic (PLEG): container finished" podID="d8db118a-59ed-4968-a6d1-52ebcbf1feca" containerID="e105a17d1f123a53db81a40583199747b0fa52dc9c4dbed951475f7aed30d7e4" exitCode=0 Mar 09 04:20:03 crc kubenswrapper[4901]: I0309 04:20:03.619984 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550500-sz6h7" event={"ID":"d8db118a-59ed-4968-a6d1-52ebcbf1feca","Type":"ContainerDied","Data":"e105a17d1f123a53db81a40583199747b0fa52dc9c4dbed951475f7aed30d7e4"} Mar 09 04:20:05 crc kubenswrapper[4901]: I0309 04:20:05.047648 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550500-sz6h7" Mar 09 04:20:05 crc kubenswrapper[4901]: I0309 04:20:05.187810 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh4rd\" (UniqueName: \"kubernetes.io/projected/d8db118a-59ed-4968-a6d1-52ebcbf1feca-kube-api-access-nh4rd\") pod \"d8db118a-59ed-4968-a6d1-52ebcbf1feca\" (UID: \"d8db118a-59ed-4968-a6d1-52ebcbf1feca\") " Mar 09 04:20:05 crc kubenswrapper[4901]: I0309 04:20:05.196457 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8db118a-59ed-4968-a6d1-52ebcbf1feca-kube-api-access-nh4rd" (OuterVolumeSpecName: "kube-api-access-nh4rd") pod "d8db118a-59ed-4968-a6d1-52ebcbf1feca" (UID: "d8db118a-59ed-4968-a6d1-52ebcbf1feca"). InnerVolumeSpecName "kube-api-access-nh4rd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:20:05 crc kubenswrapper[4901]: I0309 04:20:05.290568 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh4rd\" (UniqueName: \"kubernetes.io/projected/d8db118a-59ed-4968-a6d1-52ebcbf1feca-kube-api-access-nh4rd\") on node \"crc\" DevicePath \"\"" Mar 09 04:20:05 crc kubenswrapper[4901]: I0309 04:20:05.660509 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550500-sz6h7" event={"ID":"d8db118a-59ed-4968-a6d1-52ebcbf1feca","Type":"ContainerDied","Data":"f356c744ede87af504a8b03fed330765841a9668b7d1634bcbf9104db2538406"} Mar 09 04:20:05 crc kubenswrapper[4901]: I0309 04:20:05.660568 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f356c744ede87af504a8b03fed330765841a9668b7d1634bcbf9104db2538406" Mar 09 04:20:05 crc kubenswrapper[4901]: I0309 04:20:05.660608 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550500-sz6h7" Mar 09 04:20:05 crc kubenswrapper[4901]: I0309 04:20:05.732136 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550494-hz2hc"] Mar 09 04:20:05 crc kubenswrapper[4901]: I0309 04:20:05.744383 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550494-hz2hc"] Mar 09 04:20:06 crc kubenswrapper[4901]: I0309 04:20:06.127381 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a875c415-225e-40d6-8e3f-c0a390112936" path="/var/lib/kubelet/pods/a875c415-225e-40d6-8e3f-c0a390112936/volumes" Mar 09 04:20:43 crc kubenswrapper[4901]: I0309 04:20:43.320594 4901 scope.go:117] "RemoveContainer" containerID="a98051d3ad7f950bd65e2e3a6812ba7a526f78921390e4e481d3a38908b9feed" Mar 09 04:20:43 crc kubenswrapper[4901]: I0309 04:20:43.366767 4901 scope.go:117] "RemoveContainer" 
containerID="2728ed9e1874ffe4f00aebc8ed43c6b257cf541534cbe5db334e6f08f54f592c" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.154497 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550502-cm4mf"] Mar 09 04:22:00 crc kubenswrapper[4901]: E0309 04:22:00.155192 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8db118a-59ed-4968-a6d1-52ebcbf1feca" containerName="oc" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.155203 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8db118a-59ed-4968-a6d1-52ebcbf1feca" containerName="oc" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.155430 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8db118a-59ed-4968-a6d1-52ebcbf1feca" containerName="oc" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.155922 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550502-cm4mf" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.158530 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.158860 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.163359 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.205846 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550502-cm4mf"] Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.315256 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4dtn\" (UniqueName: \"kubernetes.io/projected/94b70874-34da-412e-a97e-9b855fe7c149-kube-api-access-c4dtn\") 
pod \"auto-csr-approver-29550502-cm4mf\" (UID: \"94b70874-34da-412e-a97e-9b855fe7c149\") " pod="openshift-infra/auto-csr-approver-29550502-cm4mf" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.417122 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4dtn\" (UniqueName: \"kubernetes.io/projected/94b70874-34da-412e-a97e-9b855fe7c149-kube-api-access-c4dtn\") pod \"auto-csr-approver-29550502-cm4mf\" (UID: \"94b70874-34da-412e-a97e-9b855fe7c149\") " pod="openshift-infra/auto-csr-approver-29550502-cm4mf" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.443079 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4dtn\" (UniqueName: \"kubernetes.io/projected/94b70874-34da-412e-a97e-9b855fe7c149-kube-api-access-c4dtn\") pod \"auto-csr-approver-29550502-cm4mf\" (UID: \"94b70874-34da-412e-a97e-9b855fe7c149\") " pod="openshift-infra/auto-csr-approver-29550502-cm4mf" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.484866 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550502-cm4mf" Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.862982 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:22:00 crc kubenswrapper[4901]: I0309 04:22:00.863379 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:22:01 crc kubenswrapper[4901]: I0309 04:22:01.033629 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550502-cm4mf"] Mar 09 04:22:01 crc kubenswrapper[4901]: I0309 04:22:01.859861 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550502-cm4mf" event={"ID":"94b70874-34da-412e-a97e-9b855fe7c149","Type":"ContainerStarted","Data":"f6d6276b054da5f0e86196d0b9c2c93e9d5976e45477f7e96dcd08c33e6782aa"} Mar 09 04:22:02 crc kubenswrapper[4901]: I0309 04:22:02.875401 4901 generic.go:334] "Generic (PLEG): container finished" podID="94b70874-34da-412e-a97e-9b855fe7c149" containerID="4067f397471c33b2c93a890b6456f659152e6f70bd9f6fedf568bfe6402e467d" exitCode=0 Mar 09 04:22:02 crc kubenswrapper[4901]: I0309 04:22:02.875544 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550502-cm4mf" event={"ID":"94b70874-34da-412e-a97e-9b855fe7c149","Type":"ContainerDied","Data":"4067f397471c33b2c93a890b6456f659152e6f70bd9f6fedf568bfe6402e467d"} Mar 09 04:22:04 crc kubenswrapper[4901]: I0309 04:22:04.191892 4901 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550502-cm4mf" Mar 09 04:22:04 crc kubenswrapper[4901]: I0309 04:22:04.305289 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4dtn\" (UniqueName: \"kubernetes.io/projected/94b70874-34da-412e-a97e-9b855fe7c149-kube-api-access-c4dtn\") pod \"94b70874-34da-412e-a97e-9b855fe7c149\" (UID: \"94b70874-34da-412e-a97e-9b855fe7c149\") " Mar 09 04:22:04 crc kubenswrapper[4901]: I0309 04:22:04.312366 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b70874-34da-412e-a97e-9b855fe7c149-kube-api-access-c4dtn" (OuterVolumeSpecName: "kube-api-access-c4dtn") pod "94b70874-34da-412e-a97e-9b855fe7c149" (UID: "94b70874-34da-412e-a97e-9b855fe7c149"). InnerVolumeSpecName "kube-api-access-c4dtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:22:04 crc kubenswrapper[4901]: I0309 04:22:04.408187 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4dtn\" (UniqueName: \"kubernetes.io/projected/94b70874-34da-412e-a97e-9b855fe7c149-kube-api-access-c4dtn\") on node \"crc\" DevicePath \"\"" Mar 09 04:22:04 crc kubenswrapper[4901]: I0309 04:22:04.894371 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550502-cm4mf" event={"ID":"94b70874-34da-412e-a97e-9b855fe7c149","Type":"ContainerDied","Data":"f6d6276b054da5f0e86196d0b9c2c93e9d5976e45477f7e96dcd08c33e6782aa"} Mar 09 04:22:04 crc kubenswrapper[4901]: I0309 04:22:04.894415 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6d6276b054da5f0e86196d0b9c2c93e9d5976e45477f7e96dcd08c33e6782aa" Mar 09 04:22:04 crc kubenswrapper[4901]: I0309 04:22:04.894482 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550502-cm4mf" Mar 09 04:22:05 crc kubenswrapper[4901]: I0309 04:22:05.268133 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550496-nwz7b"] Mar 09 04:22:05 crc kubenswrapper[4901]: I0309 04:22:05.276335 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550496-nwz7b"] Mar 09 04:22:06 crc kubenswrapper[4901]: I0309 04:22:06.119620 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8b785e-c883-404c-ab52-a24fa9f7aac4" path="/var/lib/kubelet/pods/1f8b785e-c883-404c-ab52-a24fa9f7aac4/volumes" Mar 09 04:22:30 crc kubenswrapper[4901]: I0309 04:22:30.862850 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:22:30 crc kubenswrapper[4901]: I0309 04:22:30.863364 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.214528 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ww4h8"] Mar 09 04:22:32 crc kubenswrapper[4901]: E0309 04:22:32.216550 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b70874-34da-412e-a97e-9b855fe7c149" containerName="oc" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.216593 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b70874-34da-412e-a97e-9b855fe7c149" containerName="oc" Mar 09 04:22:32 crc 
kubenswrapper[4901]: I0309 04:22:32.216934 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b70874-34da-412e-a97e-9b855fe7c149" containerName="oc" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.218576 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.232198 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ww4h8"] Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.415286 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-utilities\") pod \"certified-operators-ww4h8\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.415488 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb9f5\" (UniqueName: \"kubernetes.io/projected/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-kube-api-access-pb9f5\") pod \"certified-operators-ww4h8\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.415738 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-catalog-content\") pod \"certified-operators-ww4h8\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.516964 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb9f5\" (UniqueName: 
\"kubernetes.io/projected/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-kube-api-access-pb9f5\") pod \"certified-operators-ww4h8\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.517051 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-catalog-content\") pod \"certified-operators-ww4h8\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.517119 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-utilities\") pod \"certified-operators-ww4h8\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.517606 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-utilities\") pod \"certified-operators-ww4h8\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.517867 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-catalog-content\") pod \"certified-operators-ww4h8\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.547602 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb9f5\" (UniqueName: 
\"kubernetes.io/projected/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-kube-api-access-pb9f5\") pod \"certified-operators-ww4h8\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:32 crc kubenswrapper[4901]: I0309 04:22:32.839911 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:33 crc kubenswrapper[4901]: I0309 04:22:33.383719 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ww4h8"] Mar 09 04:22:34 crc kubenswrapper[4901]: I0309 04:22:34.348596 4901 generic.go:334] "Generic (PLEG): container finished" podID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerID="9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8" exitCode=0 Mar 09 04:22:34 crc kubenswrapper[4901]: I0309 04:22:34.348696 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww4h8" event={"ID":"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd","Type":"ContainerDied","Data":"9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8"} Mar 09 04:22:34 crc kubenswrapper[4901]: I0309 04:22:34.349309 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww4h8" event={"ID":"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd","Type":"ContainerStarted","Data":"104911d0cb0f3580d8e0d329ada09fcd9365090a1f003552e611ddf8837c9222"} Mar 09 04:22:34 crc kubenswrapper[4901]: I0309 04:22:34.351152 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 04:22:35 crc kubenswrapper[4901]: I0309 04:22:35.362344 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww4h8" event={"ID":"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd","Type":"ContainerStarted","Data":"a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7"} Mar 09 04:22:36 
crc kubenswrapper[4901]: I0309 04:22:36.377152 4901 generic.go:334] "Generic (PLEG): container finished" podID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerID="a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7" exitCode=0 Mar 09 04:22:36 crc kubenswrapper[4901]: I0309 04:22:36.377238 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww4h8" event={"ID":"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd","Type":"ContainerDied","Data":"a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7"} Mar 09 04:22:37 crc kubenswrapper[4901]: I0309 04:22:37.387805 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww4h8" event={"ID":"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd","Type":"ContainerStarted","Data":"e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff"} Mar 09 04:22:37 crc kubenswrapper[4901]: I0309 04:22:37.415670 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ww4h8" podStartSLOduration=3.009667207 podStartE2EDuration="5.415653995s" podCreationTimestamp="2026-03-09 04:22:32 +0000 UTC" firstStartedPulling="2026-03-09 04:22:34.350895734 +0000 UTC m=+6078.940559476" lastFinishedPulling="2026-03-09 04:22:36.756882522 +0000 UTC m=+6081.346546264" observedRunningTime="2026-03-09 04:22:37.410252182 +0000 UTC m=+6081.999915924" watchObservedRunningTime="2026-03-09 04:22:37.415653995 +0000 UTC m=+6082.005317747" Mar 09 04:22:42 crc kubenswrapper[4901]: I0309 04:22:42.841155 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:42 crc kubenswrapper[4901]: I0309 04:22:42.841537 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:42 crc kubenswrapper[4901]: I0309 04:22:42.892316 4901 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:43 crc kubenswrapper[4901]: I0309 04:22:43.503611 4901 scope.go:117] "RemoveContainer" containerID="def578be99c9f0f716a9f6cf56e1dcf11e1a2b6201038c22c2d33d19113ad1e1" Mar 09 04:22:43 crc kubenswrapper[4901]: I0309 04:22:43.516478 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:43 crc kubenswrapper[4901]: I0309 04:22:43.573889 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ww4h8"] Mar 09 04:22:45 crc kubenswrapper[4901]: I0309 04:22:45.469774 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ww4h8" podUID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerName="registry-server" containerID="cri-o://e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff" gracePeriod=2 Mar 09 04:22:45 crc kubenswrapper[4901]: I0309 04:22:45.952703 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.085974 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-catalog-content\") pod \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.086084 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-utilities\") pod \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.086192 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb9f5\" (UniqueName: \"kubernetes.io/projected/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-kube-api-access-pb9f5\") pod \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\" (UID: \"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd\") " Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.087163 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-utilities" (OuterVolumeSpecName: "utilities") pod "5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" (UID: "5a884af1-fc3a-47a5-9d6a-8860bff0d0fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.093550 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-kube-api-access-pb9f5" (OuterVolumeSpecName: "kube-api-access-pb9f5") pod "5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" (UID: "5a884af1-fc3a-47a5-9d6a-8860bff0d0fd"). InnerVolumeSpecName "kube-api-access-pb9f5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.146465 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" (UID: "5a884af1-fc3a-47a5-9d6a-8860bff0d0fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.188604 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.188647 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb9f5\" (UniqueName: \"kubernetes.io/projected/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-kube-api-access-pb9f5\") on node \"crc\" DevicePath \"\"" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.188658 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.485472 4901 generic.go:334] "Generic (PLEG): container finished" podID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerID="e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff" exitCode=0 Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.485575 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww4h8" event={"ID":"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd","Type":"ContainerDied","Data":"e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff"} Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.485822 4901 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-ww4h8" event={"ID":"5a884af1-fc3a-47a5-9d6a-8860bff0d0fd","Type":"ContainerDied","Data":"104911d0cb0f3580d8e0d329ada09fcd9365090a1f003552e611ddf8837c9222"} Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.485853 4901 scope.go:117] "RemoveContainer" containerID="e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.485612 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ww4h8" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.531522 4901 scope.go:117] "RemoveContainer" containerID="a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.594471 4901 scope.go:117] "RemoveContainer" containerID="9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.633366 4901 scope.go:117] "RemoveContainer" containerID="e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff" Mar 09 04:22:46 crc kubenswrapper[4901]: E0309 04:22:46.637373 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff\": container with ID starting with e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff not found: ID does not exist" containerID="e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.637426 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff"} err="failed to get container status \"e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff\": rpc error: code = NotFound desc = could not find container 
\"e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff\": container with ID starting with e308e0ca052c911a3b1f9a5d59a3d5fea3320153fb0e5e671d259d461ace3dff not found: ID does not exist" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.637458 4901 scope.go:117] "RemoveContainer" containerID="a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7" Mar 09 04:22:46 crc kubenswrapper[4901]: E0309 04:22:46.642378 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7\": container with ID starting with a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7 not found: ID does not exist" containerID="a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.642432 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7"} err="failed to get container status \"a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7\": rpc error: code = NotFound desc = could not find container \"a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7\": container with ID starting with a801e830b179aec2fcd8786949238cca25d684ab33d0e5a3421065459260e3f7 not found: ID does not exist" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.642476 4901 scope.go:117] "RemoveContainer" containerID="9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.642595 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ww4h8"] Mar 09 04:22:46 crc kubenswrapper[4901]: E0309 04:22:46.644645 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8\": container with ID starting with 9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8 not found: ID does not exist" containerID="9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.644678 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8"} err="failed to get container status \"9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8\": rpc error: code = NotFound desc = could not find container \"9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8\": container with ID starting with 9c7dc009e8c573687b6cc31a843d6a2975bca62713fd27a4a6d7cfea7d438ec8 not found: ID does not exist" Mar 09 04:22:46 crc kubenswrapper[4901]: I0309 04:22:46.652006 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ww4h8"] Mar 09 04:22:48 crc kubenswrapper[4901]: I0309 04:22:48.127334 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" path="/var/lib/kubelet/pods/5a884af1-fc3a-47a5-9d6a-8860bff0d0fd/volumes" Mar 09 04:23:00 crc kubenswrapper[4901]: I0309 04:23:00.862738 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:23:00 crc kubenswrapper[4901]: I0309 04:23:00.863397 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 09 04:23:00 crc kubenswrapper[4901]: I0309 04:23:00.863444 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 04:23:00 crc kubenswrapper[4901]: I0309 04:23:00.864306 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 04:23:00 crc kubenswrapper[4901]: I0309 04:23:00.864363 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" gracePeriod=600 Mar 09 04:23:00 crc kubenswrapper[4901]: E0309 04:23:00.998416 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:23:01 crc kubenswrapper[4901]: I0309 04:23:01.636639 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" exitCode=0 Mar 09 04:23:01 crc kubenswrapper[4901]: I0309 04:23:01.636689 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b"} Mar 09 04:23:01 crc kubenswrapper[4901]: I0309 04:23:01.637034 4901 scope.go:117] "RemoveContainer" containerID="e7ff674ef89b0fc5d88826b6774b6df277813f850f698639d418107ba854ec9d" Mar 09 04:23:01 crc kubenswrapper[4901]: I0309 04:23:01.637841 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:23:01 crc kubenswrapper[4901]: E0309 04:23:01.638125 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:23:15 crc kubenswrapper[4901]: I0309 04:23:15.106895 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:23:15 crc kubenswrapper[4901]: E0309 04:23:15.107817 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:23:29 crc kubenswrapper[4901]: I0309 04:23:29.107545 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:23:29 crc kubenswrapper[4901]: E0309 04:23:29.108906 4901 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:23:43 crc kubenswrapper[4901]: I0309 04:23:43.108192 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:23:43 crc kubenswrapper[4901]: E0309 04:23:43.109578 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:23:55 crc kubenswrapper[4901]: I0309 04:23:55.107069 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:23:55 crc kubenswrapper[4901]: E0309 04:23:55.108062 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:23:57 crc kubenswrapper[4901]: I0309 04:23:57.063530 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-03b9-account-create-update-jbqws"] Mar 09 04:23:57 crc kubenswrapper[4901]: I0309 
04:23:57.073476 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-s7nhh"] Mar 09 04:23:57 crc kubenswrapper[4901]: I0309 04:23:57.084445 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-03b9-account-create-update-jbqws"] Mar 09 04:23:57 crc kubenswrapper[4901]: I0309 04:23:57.092587 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-s7nhh"] Mar 09 04:23:58 crc kubenswrapper[4901]: I0309 04:23:58.121744 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c" path="/var/lib/kubelet/pods/09b6b1e3-ab73-49e7-aecd-fcd62e8bbf8c/volumes" Mar 09 04:23:58 crc kubenswrapper[4901]: I0309 04:23:58.122333 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6445c1e-79ac-4e92-b7ab-a697417e44f8" path="/var/lib/kubelet/pods/b6445c1e-79ac-4e92-b7ab-a697417e44f8/volumes" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.221468 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550504-7mjqt"] Mar 09 04:24:00 crc kubenswrapper[4901]: E0309 04:24:00.222124 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerName="registry-server" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.222152 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerName="registry-server" Mar 09 04:24:00 crc kubenswrapper[4901]: E0309 04:24:00.222203 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerName="extract-content" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.222219 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerName="extract-content" Mar 09 04:24:00 crc kubenswrapper[4901]: E0309 04:24:00.222279 4901 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerName="extract-utilities" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.222298 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerName="extract-utilities" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.222630 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a884af1-fc3a-47a5-9d6a-8860bff0d0fd" containerName="registry-server" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.223688 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550504-7mjqt" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.225848 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.226671 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.227869 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.239817 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550504-7mjqt"] Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.338089 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8h2h\" (UniqueName: \"kubernetes.io/projected/5aff16dd-be1c-4dad-9397-1d215c1f9ef8-kube-api-access-d8h2h\") pod \"auto-csr-approver-29550504-7mjqt\" (UID: \"5aff16dd-be1c-4dad-9397-1d215c1f9ef8\") " pod="openshift-infra/auto-csr-approver-29550504-7mjqt" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.440341 4901 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-d8h2h\" (UniqueName: \"kubernetes.io/projected/5aff16dd-be1c-4dad-9397-1d215c1f9ef8-kube-api-access-d8h2h\") pod \"auto-csr-approver-29550504-7mjqt\" (UID: \"5aff16dd-be1c-4dad-9397-1d215c1f9ef8\") " pod="openshift-infra/auto-csr-approver-29550504-7mjqt" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.472952 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8h2h\" (UniqueName: \"kubernetes.io/projected/5aff16dd-be1c-4dad-9397-1d215c1f9ef8-kube-api-access-d8h2h\") pod \"auto-csr-approver-29550504-7mjqt\" (UID: \"5aff16dd-be1c-4dad-9397-1d215c1f9ef8\") " pod="openshift-infra/auto-csr-approver-29550504-7mjqt" Mar 09 04:24:00 crc kubenswrapper[4901]: I0309 04:24:00.558212 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550504-7mjqt" Mar 09 04:24:01 crc kubenswrapper[4901]: I0309 04:24:01.060022 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550504-7mjqt"] Mar 09 04:24:01 crc kubenswrapper[4901]: I0309 04:24:01.238767 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550504-7mjqt" event={"ID":"5aff16dd-be1c-4dad-9397-1d215c1f9ef8","Type":"ContainerStarted","Data":"7eb3db0d4ecb457b0dc32ea58a9ae6632fc43dc26f264cb7c82fb542f6699679"} Mar 09 04:24:03 crc kubenswrapper[4901]: I0309 04:24:03.049533 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-stg5m"] Mar 09 04:24:03 crc kubenswrapper[4901]: I0309 04:24:03.058358 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-stg5m"] Mar 09 04:24:03 crc kubenswrapper[4901]: I0309 04:24:03.267930 4901 generic.go:334] "Generic (PLEG): container finished" podID="5aff16dd-be1c-4dad-9397-1d215c1f9ef8" containerID="50be4bf5bfc3bc874fb44c0c48bb61119d9960932bc2dd997a5d03f96f96f24e" exitCode=0 Mar 09 04:24:03 crc 
kubenswrapper[4901]: I0309 04:24:03.268276 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550504-7mjqt" event={"ID":"5aff16dd-be1c-4dad-9397-1d215c1f9ef8","Type":"ContainerDied","Data":"50be4bf5bfc3bc874fb44c0c48bb61119d9960932bc2dd997a5d03f96f96f24e"} Mar 09 04:24:04 crc kubenswrapper[4901]: I0309 04:24:04.127112 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9980b6c-281f-4ee2-82c5-ae0be5525a75" path="/var/lib/kubelet/pods/c9980b6c-281f-4ee2-82c5-ae0be5525a75/volumes" Mar 09 04:24:04 crc kubenswrapper[4901]: I0309 04:24:04.671659 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550504-7mjqt" Mar 09 04:24:04 crc kubenswrapper[4901]: I0309 04:24:04.818585 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8h2h\" (UniqueName: \"kubernetes.io/projected/5aff16dd-be1c-4dad-9397-1d215c1f9ef8-kube-api-access-d8h2h\") pod \"5aff16dd-be1c-4dad-9397-1d215c1f9ef8\" (UID: \"5aff16dd-be1c-4dad-9397-1d215c1f9ef8\") " Mar 09 04:24:04 crc kubenswrapper[4901]: I0309 04:24:04.823661 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aff16dd-be1c-4dad-9397-1d215c1f9ef8-kube-api-access-d8h2h" (OuterVolumeSpecName: "kube-api-access-d8h2h") pod "5aff16dd-be1c-4dad-9397-1d215c1f9ef8" (UID: "5aff16dd-be1c-4dad-9397-1d215c1f9ef8"). InnerVolumeSpecName "kube-api-access-d8h2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:24:04 crc kubenswrapper[4901]: I0309 04:24:04.920787 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8h2h\" (UniqueName: \"kubernetes.io/projected/5aff16dd-be1c-4dad-9397-1d215c1f9ef8-kube-api-access-d8h2h\") on node \"crc\" DevicePath \"\"" Mar 09 04:24:05 crc kubenswrapper[4901]: I0309 04:24:05.293329 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550504-7mjqt" event={"ID":"5aff16dd-be1c-4dad-9397-1d215c1f9ef8","Type":"ContainerDied","Data":"7eb3db0d4ecb457b0dc32ea58a9ae6632fc43dc26f264cb7c82fb542f6699679"} Mar 09 04:24:05 crc kubenswrapper[4901]: I0309 04:24:05.293383 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eb3db0d4ecb457b0dc32ea58a9ae6632fc43dc26f264cb7c82fb542f6699679" Mar 09 04:24:05 crc kubenswrapper[4901]: I0309 04:24:05.293411 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550504-7mjqt" Mar 09 04:24:05 crc kubenswrapper[4901]: I0309 04:24:05.753345 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550498-mpkrx"] Mar 09 04:24:05 crc kubenswrapper[4901]: I0309 04:24:05.761196 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550498-mpkrx"] Mar 09 04:24:06 crc kubenswrapper[4901]: I0309 04:24:06.122841 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="391820fa-b20b-475a-a06d-326041cf8728" path="/var/lib/kubelet/pods/391820fa-b20b-475a-a06d-326041cf8728/volumes" Mar 09 04:24:07 crc kubenswrapper[4901]: I0309 04:24:07.106425 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:24:07 crc kubenswrapper[4901]: E0309 04:24:07.106742 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:24:18 crc kubenswrapper[4901]: I0309 04:24:18.072300 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-px9wn"] Mar 09 04:24:18 crc kubenswrapper[4901]: I0309 04:24:18.088305 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-px9wn"] Mar 09 04:24:18 crc kubenswrapper[4901]: I0309 04:24:18.125466 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf" path="/var/lib/kubelet/pods/9ef77a0a-b8ca-4b35-9951-b3ca9d1d3faf/volumes" Mar 09 04:24:21 crc kubenswrapper[4901]: I0309 04:24:21.107271 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:24:21 crc kubenswrapper[4901]: E0309 04:24:21.108028 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:24:35 crc kubenswrapper[4901]: I0309 04:24:35.107479 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:24:35 crc kubenswrapper[4901]: E0309 04:24:35.108996 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:24:43 crc kubenswrapper[4901]: I0309 04:24:43.631175 4901 scope.go:117] "RemoveContainer" containerID="de454455d0ff97b922aa28c731915769c3c32e2566a19f004e38a89f59e7c628" Mar 09 04:24:43 crc kubenswrapper[4901]: I0309 04:24:43.663067 4901 scope.go:117] "RemoveContainer" containerID="a1476bec4bb7633a09dfe3d8f42b2a3e0099e79486ce817ae3993dbc5c11e195" Mar 09 04:24:43 crc kubenswrapper[4901]: I0309 04:24:43.729808 4901 scope.go:117] "RemoveContainer" containerID="7684a5e7253918e334c6c32e9589d03561b9feaab1c38a27e3370cd0b552990e" Mar 09 04:24:43 crc kubenswrapper[4901]: I0309 04:24:43.795889 4901 scope.go:117] "RemoveContainer" containerID="df28497cdaa411177171933b7a13f589d2a5cf20b3b66e05ac9eedf98db29289" Mar 09 04:24:43 crc kubenswrapper[4901]: I0309 04:24:43.821521 4901 scope.go:117] "RemoveContainer" containerID="376408b9a1611bfecca129502d5376008f1f799e32d35e8669afb17e4df0cbd3" Mar 09 04:24:49 crc kubenswrapper[4901]: I0309 04:24:49.106775 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:24:49 crc kubenswrapper[4901]: E0309 04:24:49.107905 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:25:03 crc kubenswrapper[4901]: I0309 04:25:03.106707 4901 scope.go:117] "RemoveContainer" 
containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:25:03 crc kubenswrapper[4901]: E0309 04:25:03.108298 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:25:15 crc kubenswrapper[4901]: I0309 04:25:15.106042 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:25:15 crc kubenswrapper[4901]: E0309 04:25:15.106979 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:25:29 crc kubenswrapper[4901]: I0309 04:25:29.107885 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:25:29 crc kubenswrapper[4901]: E0309 04:25:29.108868 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:25:44 crc kubenswrapper[4901]: I0309 04:25:44.109739 4901 scope.go:117] 
"RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:25:44 crc kubenswrapper[4901]: E0309 04:25:44.110711 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:25:55 crc kubenswrapper[4901]: I0309 04:25:55.106995 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:25:55 crc kubenswrapper[4901]: E0309 04:25:55.108272 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.181611 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550506-dcstq"] Mar 09 04:26:00 crc kubenswrapper[4901]: E0309 04:26:00.182663 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aff16dd-be1c-4dad-9397-1d215c1f9ef8" containerName="oc" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.182684 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aff16dd-be1c-4dad-9397-1d215c1f9ef8" containerName="oc" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.182947 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aff16dd-be1c-4dad-9397-1d215c1f9ef8" containerName="oc" Mar 
09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.183796 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550506-dcstq" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.186195 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.186423 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.188996 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.241856 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550506-dcstq"] Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.349325 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rg88\" (UniqueName: \"kubernetes.io/projected/ead7818d-771b-4ba3-b7ad-95ab0bd15378-kube-api-access-8rg88\") pod \"auto-csr-approver-29550506-dcstq\" (UID: \"ead7818d-771b-4ba3-b7ad-95ab0bd15378\") " pod="openshift-infra/auto-csr-approver-29550506-dcstq" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.451687 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rg88\" (UniqueName: \"kubernetes.io/projected/ead7818d-771b-4ba3-b7ad-95ab0bd15378-kube-api-access-8rg88\") pod \"auto-csr-approver-29550506-dcstq\" (UID: \"ead7818d-771b-4ba3-b7ad-95ab0bd15378\") " pod="openshift-infra/auto-csr-approver-29550506-dcstq" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.470450 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rg88\" (UniqueName: 
\"kubernetes.io/projected/ead7818d-771b-4ba3-b7ad-95ab0bd15378-kube-api-access-8rg88\") pod \"auto-csr-approver-29550506-dcstq\" (UID: \"ead7818d-771b-4ba3-b7ad-95ab0bd15378\") " pod="openshift-infra/auto-csr-approver-29550506-dcstq" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.505763 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550506-dcstq" Mar 09 04:26:00 crc kubenswrapper[4901]: I0309 04:26:00.801407 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550506-dcstq"] Mar 09 04:26:01 crc kubenswrapper[4901]: I0309 04:26:01.508723 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550506-dcstq" event={"ID":"ead7818d-771b-4ba3-b7ad-95ab0bd15378","Type":"ContainerStarted","Data":"de9f7deda44eab6d73a6b60b4ee3530762f87740997d0afe8ad23b4b14e915ff"} Mar 09 04:26:02 crc kubenswrapper[4901]: I0309 04:26:02.523780 4901 generic.go:334] "Generic (PLEG): container finished" podID="ead7818d-771b-4ba3-b7ad-95ab0bd15378" containerID="87594a0e39eff61661b0cfa988c28592af47973f050c1f1458d2d64ce6aed757" exitCode=0 Mar 09 04:26:02 crc kubenswrapper[4901]: I0309 04:26:02.524089 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550506-dcstq" event={"ID":"ead7818d-771b-4ba3-b7ad-95ab0bd15378","Type":"ContainerDied","Data":"87594a0e39eff61661b0cfa988c28592af47973f050c1f1458d2d64ce6aed757"} Mar 09 04:26:03 crc kubenswrapper[4901]: I0309 04:26:03.882442 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550506-dcstq" Mar 09 04:26:04 crc kubenswrapper[4901]: I0309 04:26:04.021292 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rg88\" (UniqueName: \"kubernetes.io/projected/ead7818d-771b-4ba3-b7ad-95ab0bd15378-kube-api-access-8rg88\") pod \"ead7818d-771b-4ba3-b7ad-95ab0bd15378\" (UID: \"ead7818d-771b-4ba3-b7ad-95ab0bd15378\") " Mar 09 04:26:04 crc kubenswrapper[4901]: I0309 04:26:04.027406 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead7818d-771b-4ba3-b7ad-95ab0bd15378-kube-api-access-8rg88" (OuterVolumeSpecName: "kube-api-access-8rg88") pod "ead7818d-771b-4ba3-b7ad-95ab0bd15378" (UID: "ead7818d-771b-4ba3-b7ad-95ab0bd15378"). InnerVolumeSpecName "kube-api-access-8rg88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:26:04 crc kubenswrapper[4901]: I0309 04:26:04.124897 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rg88\" (UniqueName: \"kubernetes.io/projected/ead7818d-771b-4ba3-b7ad-95ab0bd15378-kube-api-access-8rg88\") on node \"crc\" DevicePath \"\"" Mar 09 04:26:04 crc kubenswrapper[4901]: I0309 04:26:04.549760 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550506-dcstq" event={"ID":"ead7818d-771b-4ba3-b7ad-95ab0bd15378","Type":"ContainerDied","Data":"de9f7deda44eab6d73a6b60b4ee3530762f87740997d0afe8ad23b4b14e915ff"} Mar 09 04:26:04 crc kubenswrapper[4901]: I0309 04:26:04.549833 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9f7deda44eab6d73a6b60b4ee3530762f87740997d0afe8ad23b4b14e915ff" Mar 09 04:26:04 crc kubenswrapper[4901]: I0309 04:26:04.549911 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550506-dcstq" Mar 09 04:26:05 crc kubenswrapper[4901]: I0309 04:26:05.003480 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550500-sz6h7"] Mar 09 04:26:05 crc kubenswrapper[4901]: I0309 04:26:05.012822 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550500-sz6h7"] Mar 09 04:26:06 crc kubenswrapper[4901]: I0309 04:26:06.122097 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8db118a-59ed-4968-a6d1-52ebcbf1feca" path="/var/lib/kubelet/pods/d8db118a-59ed-4968-a6d1-52ebcbf1feca/volumes" Mar 09 04:26:09 crc kubenswrapper[4901]: I0309 04:26:09.106907 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:26:09 crc kubenswrapper[4901]: E0309 04:26:09.107573 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:26:21 crc kubenswrapper[4901]: I0309 04:26:21.112726 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:26:21 crc kubenswrapper[4901]: E0309 04:26:21.114130 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:26:35 crc kubenswrapper[4901]: I0309 04:26:35.106572 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:26:35 crc kubenswrapper[4901]: E0309 04:26:35.108130 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:26:43 crc kubenswrapper[4901]: I0309 04:26:43.970255 4901 scope.go:117] "RemoveContainer" containerID="e105a17d1f123a53db81a40583199747b0fa52dc9c4dbed951475f7aed30d7e4" Mar 09 04:26:47 crc kubenswrapper[4901]: I0309 04:26:47.106437 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:26:47 crc kubenswrapper[4901]: E0309 04:26:47.106836 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:27:02 crc kubenswrapper[4901]: I0309 04:27:02.106093 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:27:02 crc kubenswrapper[4901]: E0309 04:27:02.106958 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:27:16 crc kubenswrapper[4901]: I0309 04:27:16.154488 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:27:16 crc kubenswrapper[4901]: E0309 04:27:16.155736 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.121506 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tlwkw"] Mar 09 04:27:18 crc kubenswrapper[4901]: E0309 04:27:18.121793 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead7818d-771b-4ba3-b7ad-95ab0bd15378" containerName="oc" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.121804 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead7818d-771b-4ba3-b7ad-95ab0bd15378" containerName="oc" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.121990 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ead7818d-771b-4ba3-b7ad-95ab0bd15378" containerName="oc" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.123393 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.133623 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tlwkw"] Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.186340 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-catalog-content\") pod \"community-operators-tlwkw\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.186701 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5449h\" (UniqueName: \"kubernetes.io/projected/db171228-c197-4ec8-9a72-2244710d178b-kube-api-access-5449h\") pod \"community-operators-tlwkw\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.186813 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-utilities\") pod \"community-operators-tlwkw\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.288346 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-catalog-content\") pod \"community-operators-tlwkw\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.288402 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5449h\" (UniqueName: \"kubernetes.io/projected/db171228-c197-4ec8-9a72-2244710d178b-kube-api-access-5449h\") pod \"community-operators-tlwkw\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.288461 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-utilities\") pod \"community-operators-tlwkw\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.289112 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-utilities\") pod \"community-operators-tlwkw\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.289446 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-catalog-content\") pod \"community-operators-tlwkw\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.312785 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5449h\" (UniqueName: \"kubernetes.io/projected/db171228-c197-4ec8-9a72-2244710d178b-kube-api-access-5449h\") pod \"community-operators-tlwkw\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.448918 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:18 crc kubenswrapper[4901]: I0309 04:27:18.936035 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tlwkw"] Mar 09 04:27:18 crc kubenswrapper[4901]: W0309 04:27:18.939502 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb171228_c197_4ec8_9a72_2244710d178b.slice/crio-0368962f721beeb8dccf97e9555d27ef36ffb58f270fb1ea2d7c997b29832a4e WatchSource:0}: Error finding container 0368962f721beeb8dccf97e9555d27ef36ffb58f270fb1ea2d7c997b29832a4e: Status 404 returned error can't find the container with id 0368962f721beeb8dccf97e9555d27ef36ffb58f270fb1ea2d7c997b29832a4e Mar 09 04:27:19 crc kubenswrapper[4901]: I0309 04:27:19.249990 4901 generic.go:334] "Generic (PLEG): container finished" podID="db171228-c197-4ec8-9a72-2244710d178b" containerID="8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e" exitCode=0 Mar 09 04:27:19 crc kubenswrapper[4901]: I0309 04:27:19.250076 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlwkw" event={"ID":"db171228-c197-4ec8-9a72-2244710d178b","Type":"ContainerDied","Data":"8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e"} Mar 09 04:27:19 crc kubenswrapper[4901]: I0309 04:27:19.250351 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlwkw" event={"ID":"db171228-c197-4ec8-9a72-2244710d178b","Type":"ContainerStarted","Data":"0368962f721beeb8dccf97e9555d27ef36ffb58f270fb1ea2d7c997b29832a4e"} Mar 09 04:27:20 crc kubenswrapper[4901]: I0309 04:27:20.267211 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlwkw" 
event={"ID":"db171228-c197-4ec8-9a72-2244710d178b","Type":"ContainerStarted","Data":"304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528"} Mar 09 04:27:21 crc kubenswrapper[4901]: I0309 04:27:21.311822 4901 generic.go:334] "Generic (PLEG): container finished" podID="db171228-c197-4ec8-9a72-2244710d178b" containerID="304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528" exitCode=0 Mar 09 04:27:21 crc kubenswrapper[4901]: I0309 04:27:21.312121 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlwkw" event={"ID":"db171228-c197-4ec8-9a72-2244710d178b","Type":"ContainerDied","Data":"304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528"} Mar 09 04:27:22 crc kubenswrapper[4901]: I0309 04:27:22.323164 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlwkw" event={"ID":"db171228-c197-4ec8-9a72-2244710d178b","Type":"ContainerStarted","Data":"aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2"} Mar 09 04:27:22 crc kubenswrapper[4901]: I0309 04:27:22.346592 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tlwkw" podStartSLOduration=1.8944000920000001 podStartE2EDuration="4.346575186s" podCreationTimestamp="2026-03-09 04:27:18 +0000 UTC" firstStartedPulling="2026-03-09 04:27:19.251875238 +0000 UTC m=+6363.841538990" lastFinishedPulling="2026-03-09 04:27:21.704050322 +0000 UTC m=+6366.293714084" observedRunningTime="2026-03-09 04:27:22.342763272 +0000 UTC m=+6366.932427024" watchObservedRunningTime="2026-03-09 04:27:22.346575186 +0000 UTC m=+6366.936238918" Mar 09 04:27:28 crc kubenswrapper[4901]: I0309 04:27:28.449470 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:28 crc kubenswrapper[4901]: I0309 04:27:28.450155 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:28 crc kubenswrapper[4901]: I0309 04:27:28.520230 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:29 crc kubenswrapper[4901]: I0309 04:27:29.446533 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:29 crc kubenswrapper[4901]: I0309 04:27:29.518674 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tlwkw"] Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.107037 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:27:31 crc kubenswrapper[4901]: E0309 04:27:31.107926 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.184445 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sk8pm"] Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.187049 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.219483 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk8pm"] Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.252167 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-utilities\") pod \"redhat-marketplace-sk8pm\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.252343 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfs8h\" (UniqueName: \"kubernetes.io/projected/9235a26f-6030-48c5-8177-4f2d3772ef93-kube-api-access-kfs8h\") pod \"redhat-marketplace-sk8pm\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.252405 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-catalog-content\") pod \"redhat-marketplace-sk8pm\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.353391 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-catalog-content\") pod \"redhat-marketplace-sk8pm\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.353509 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-utilities\") pod \"redhat-marketplace-sk8pm\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.354088 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-catalog-content\") pod \"redhat-marketplace-sk8pm\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.354157 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-utilities\") pod \"redhat-marketplace-sk8pm\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.354279 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfs8h\" (UniqueName: \"kubernetes.io/projected/9235a26f-6030-48c5-8177-4f2d3772ef93-kube-api-access-kfs8h\") pod \"redhat-marketplace-sk8pm\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.374040 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfs8h\" (UniqueName: \"kubernetes.io/projected/9235a26f-6030-48c5-8177-4f2d3772ef93-kube-api-access-kfs8h\") pod \"redhat-marketplace-sk8pm\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.403063 4901 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-tlwkw" podUID="db171228-c197-4ec8-9a72-2244710d178b" containerName="registry-server" containerID="cri-o://aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2" gracePeriod=2 Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.550849 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:31 crc kubenswrapper[4901]: I0309 04:27:31.918673 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.064861 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-utilities\") pod \"db171228-c197-4ec8-9a72-2244710d178b\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.065125 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-catalog-content\") pod \"db171228-c197-4ec8-9a72-2244710d178b\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.065179 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5449h\" (UniqueName: \"kubernetes.io/projected/db171228-c197-4ec8-9a72-2244710d178b-kube-api-access-5449h\") pod \"db171228-c197-4ec8-9a72-2244710d178b\" (UID: \"db171228-c197-4ec8-9a72-2244710d178b\") " Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.065737 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-utilities" (OuterVolumeSpecName: "utilities") pod "db171228-c197-4ec8-9a72-2244710d178b" (UID: 
"db171228-c197-4ec8-9a72-2244710d178b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.066873 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk8pm"] Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.069956 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db171228-c197-4ec8-9a72-2244710d178b-kube-api-access-5449h" (OuterVolumeSpecName: "kube-api-access-5449h") pod "db171228-c197-4ec8-9a72-2244710d178b" (UID: "db171228-c197-4ec8-9a72-2244710d178b"). InnerVolumeSpecName "kube-api-access-5449h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.118978 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db171228-c197-4ec8-9a72-2244710d178b" (UID: "db171228-c197-4ec8-9a72-2244710d178b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.167768 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.167801 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5449h\" (UniqueName: \"kubernetes.io/projected/db171228-c197-4ec8-9a72-2244710d178b-kube-api-access-5449h\") on node \"crc\" DevicePath \"\"" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.167812 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db171228-c197-4ec8-9a72-2244710d178b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.414448 4901 generic.go:334] "Generic (PLEG): container finished" podID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerID="83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a" exitCode=0 Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.414533 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk8pm" event={"ID":"9235a26f-6030-48c5-8177-4f2d3772ef93","Type":"ContainerDied","Data":"83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a"} Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.414567 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk8pm" event={"ID":"9235a26f-6030-48c5-8177-4f2d3772ef93","Type":"ContainerStarted","Data":"195abd6bed11249e6e4bf011c5512ebb8b29a30eadacec71549dcb29af292fa3"} Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.421027 4901 generic.go:334] "Generic (PLEG): container finished" podID="db171228-c197-4ec8-9a72-2244710d178b" 
containerID="aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2" exitCode=0 Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.421070 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlwkw" event={"ID":"db171228-c197-4ec8-9a72-2244710d178b","Type":"ContainerDied","Data":"aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2"} Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.421099 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlwkw" event={"ID":"db171228-c197-4ec8-9a72-2244710d178b","Type":"ContainerDied","Data":"0368962f721beeb8dccf97e9555d27ef36ffb58f270fb1ea2d7c997b29832a4e"} Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.421111 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlwkw" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.421121 4901 scope.go:117] "RemoveContainer" containerID="aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.456734 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tlwkw"] Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.456963 4901 scope.go:117] "RemoveContainer" containerID="304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.465872 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tlwkw"] Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.487404 4901 scope.go:117] "RemoveContainer" containerID="8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.526665 4901 scope.go:117] "RemoveContainer" containerID="aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2" Mar 09 
04:27:32 crc kubenswrapper[4901]: E0309 04:27:32.527140 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2\": container with ID starting with aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2 not found: ID does not exist" containerID="aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.527181 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2"} err="failed to get container status \"aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2\": rpc error: code = NotFound desc = could not find container \"aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2\": container with ID starting with aa7ae259cf05d5f825014ce23692762d0f9e4b07499a2e03a2f1abfb388ae2e2 not found: ID does not exist" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.527210 4901 scope.go:117] "RemoveContainer" containerID="304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528" Mar 09 04:27:32 crc kubenswrapper[4901]: E0309 04:27:32.527641 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528\": container with ID starting with 304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528 not found: ID does not exist" containerID="304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.527676 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528"} err="failed to get container status 
\"304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528\": rpc error: code = NotFound desc = could not find container \"304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528\": container with ID starting with 304fa9e57ac8183e64a68ee914bd0930b7ebfca0369c8d0f8604a5c67dbb0528 not found: ID does not exist" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.527698 4901 scope.go:117] "RemoveContainer" containerID="8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e" Mar 09 04:27:32 crc kubenswrapper[4901]: E0309 04:27:32.527989 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e\": container with ID starting with 8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e not found: ID does not exist" containerID="8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e" Mar 09 04:27:32 crc kubenswrapper[4901]: I0309 04:27:32.528013 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e"} err="failed to get container status \"8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e\": rpc error: code = NotFound desc = could not find container \"8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e\": container with ID starting with 8b075e898464a9984afb8f612e05b104326f114752190c5c22f8aa93edc9f41e not found: ID does not exist" Mar 09 04:27:33 crc kubenswrapper[4901]: I0309 04:27:33.434093 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk8pm" event={"ID":"9235a26f-6030-48c5-8177-4f2d3772ef93","Type":"ContainerStarted","Data":"6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7"} Mar 09 04:27:34 crc kubenswrapper[4901]: I0309 04:27:34.125250 4901 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="db171228-c197-4ec8-9a72-2244710d178b" path="/var/lib/kubelet/pods/db171228-c197-4ec8-9a72-2244710d178b/volumes" Mar 09 04:27:34 crc kubenswrapper[4901]: I0309 04:27:34.454542 4901 generic.go:334] "Generic (PLEG): container finished" podID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerID="6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7" exitCode=0 Mar 09 04:27:34 crc kubenswrapper[4901]: I0309 04:27:34.454583 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk8pm" event={"ID":"9235a26f-6030-48c5-8177-4f2d3772ef93","Type":"ContainerDied","Data":"6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7"} Mar 09 04:27:34 crc kubenswrapper[4901]: I0309 04:27:34.457485 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 04:27:35 crc kubenswrapper[4901]: I0309 04:27:35.464789 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk8pm" event={"ID":"9235a26f-6030-48c5-8177-4f2d3772ef93","Type":"ContainerStarted","Data":"7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0"} Mar 09 04:27:35 crc kubenswrapper[4901]: I0309 04:27:35.484539 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sk8pm" podStartSLOduration=2.034342266 podStartE2EDuration="4.484523692s" podCreationTimestamp="2026-03-09 04:27:31 +0000 UTC" firstStartedPulling="2026-03-09 04:27:32.416215623 +0000 UTC m=+6377.005879355" lastFinishedPulling="2026-03-09 04:27:34.866397049 +0000 UTC m=+6379.456060781" observedRunningTime="2026-03-09 04:27:35.479697053 +0000 UTC m=+6380.069360785" watchObservedRunningTime="2026-03-09 04:27:35.484523692 +0000 UTC m=+6380.074187424" Mar 09 04:27:41 crc kubenswrapper[4901]: I0309 04:27:41.551602 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:41 crc kubenswrapper[4901]: I0309 04:27:41.551983 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:41 crc kubenswrapper[4901]: I0309 04:27:41.636467 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:42 crc kubenswrapper[4901]: I0309 04:27:42.611036 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:42 crc kubenswrapper[4901]: I0309 04:27:42.677356 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk8pm"] Mar 09 04:27:44 crc kubenswrapper[4901]: I0309 04:27:44.552818 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sk8pm" podUID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerName="registry-server" containerID="cri-o://7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0" gracePeriod=2 Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.106526 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:27:45 crc kubenswrapper[4901]: E0309 04:27:45.107136 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.137672 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.223634 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfs8h\" (UniqueName: \"kubernetes.io/projected/9235a26f-6030-48c5-8177-4f2d3772ef93-kube-api-access-kfs8h\") pod \"9235a26f-6030-48c5-8177-4f2d3772ef93\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.223799 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-catalog-content\") pod \"9235a26f-6030-48c5-8177-4f2d3772ef93\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.223825 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-utilities\") pod \"9235a26f-6030-48c5-8177-4f2d3772ef93\" (UID: \"9235a26f-6030-48c5-8177-4f2d3772ef93\") " Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.224639 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-utilities" (OuterVolumeSpecName: "utilities") pod "9235a26f-6030-48c5-8177-4f2d3772ef93" (UID: "9235a26f-6030-48c5-8177-4f2d3772ef93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.228799 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9235a26f-6030-48c5-8177-4f2d3772ef93-kube-api-access-kfs8h" (OuterVolumeSpecName: "kube-api-access-kfs8h") pod "9235a26f-6030-48c5-8177-4f2d3772ef93" (UID: "9235a26f-6030-48c5-8177-4f2d3772ef93"). InnerVolumeSpecName "kube-api-access-kfs8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.256292 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9235a26f-6030-48c5-8177-4f2d3772ef93" (UID: "9235a26f-6030-48c5-8177-4f2d3772ef93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.326130 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfs8h\" (UniqueName: \"kubernetes.io/projected/9235a26f-6030-48c5-8177-4f2d3772ef93-kube-api-access-kfs8h\") on node \"crc\" DevicePath \"\"" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.326173 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.326184 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9235a26f-6030-48c5-8177-4f2d3772ef93-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.566402 4901 generic.go:334] "Generic (PLEG): container finished" podID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerID="7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0" exitCode=0 Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.566460 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk8pm" event={"ID":"9235a26f-6030-48c5-8177-4f2d3772ef93","Type":"ContainerDied","Data":"7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0"} Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.566500 4901 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-sk8pm" event={"ID":"9235a26f-6030-48c5-8177-4f2d3772ef93","Type":"ContainerDied","Data":"195abd6bed11249e6e4bf011c5512ebb8b29a30eadacec71549dcb29af292fa3"} Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.566523 4901 scope.go:117] "RemoveContainer" containerID="7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.566540 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk8pm" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.598162 4901 scope.go:117] "RemoveContainer" containerID="6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.625086 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk8pm"] Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.634663 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk8pm"] Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.638322 4901 scope.go:117] "RemoveContainer" containerID="83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.673816 4901 scope.go:117] "RemoveContainer" containerID="7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0" Mar 09 04:27:45 crc kubenswrapper[4901]: E0309 04:27:45.674542 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0\": container with ID starting with 7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0 not found: ID does not exist" containerID="7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.674633 4901 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0"} err="failed to get container status \"7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0\": rpc error: code = NotFound desc = could not find container \"7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0\": container with ID starting with 7559cbe77c9b51a82cf5823003a8304be80b5a3a09263adc5d2c00de44297df0 not found: ID does not exist" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.674692 4901 scope.go:117] "RemoveContainer" containerID="6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7" Mar 09 04:27:45 crc kubenswrapper[4901]: E0309 04:27:45.675380 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7\": container with ID starting with 6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7 not found: ID does not exist" containerID="6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.675452 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7"} err="failed to get container status \"6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7\": rpc error: code = NotFound desc = could not find container \"6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7\": container with ID starting with 6f16d730a4a44880ba8749e464863b9f57b04fe68f4c273770cb927d712f67c7 not found: ID does not exist" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.675497 4901 scope.go:117] "RemoveContainer" containerID="83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a" Mar 09 04:27:45 crc kubenswrapper[4901]: E0309 
04:27:45.675980 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a\": container with ID starting with 83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a not found: ID does not exist" containerID="83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a" Mar 09 04:27:45 crc kubenswrapper[4901]: I0309 04:27:45.676014 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a"} err="failed to get container status \"83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a\": rpc error: code = NotFound desc = could not find container \"83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a\": container with ID starting with 83be0ee304486f308a9f457ceccb3a019b7ef836b9a4390184190bfb9fb6732a not found: ID does not exist" Mar 09 04:27:46 crc kubenswrapper[4901]: I0309 04:27:46.122620 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9235a26f-6030-48c5-8177-4f2d3772ef93" path="/var/lib/kubelet/pods/9235a26f-6030-48c5-8177-4f2d3772ef93/volumes" Mar 09 04:27:58 crc kubenswrapper[4901]: I0309 04:27:58.107014 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:27:58 crc kubenswrapper[4901]: E0309 04:27:58.108049 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.153738 
4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550508-xmpp9"] Mar 09 04:28:00 crc kubenswrapper[4901]: E0309 04:28:00.154670 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db171228-c197-4ec8-9a72-2244710d178b" containerName="registry-server" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.154685 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="db171228-c197-4ec8-9a72-2244710d178b" containerName="registry-server" Mar 09 04:28:00 crc kubenswrapper[4901]: E0309 04:28:00.154696 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerName="extract-content" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.154703 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerName="extract-content" Mar 09 04:28:00 crc kubenswrapper[4901]: E0309 04:28:00.154723 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db171228-c197-4ec8-9a72-2244710d178b" containerName="extract-content" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.154731 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="db171228-c197-4ec8-9a72-2244710d178b" containerName="extract-content" Mar 09 04:28:00 crc kubenswrapper[4901]: E0309 04:28:00.154747 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerName="registry-server" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.154754 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerName="registry-server" Mar 09 04:28:00 crc kubenswrapper[4901]: E0309 04:28:00.154766 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerName="extract-utilities" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.154776 4901 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerName="extract-utilities" Mar 09 04:28:00 crc kubenswrapper[4901]: E0309 04:28:00.154790 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db171228-c197-4ec8-9a72-2244710d178b" containerName="extract-utilities" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.154797 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="db171228-c197-4ec8-9a72-2244710d178b" containerName="extract-utilities" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.154972 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9235a26f-6030-48c5-8177-4f2d3772ef93" containerName="registry-server" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.154985 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="db171228-c197-4ec8-9a72-2244710d178b" containerName="registry-server" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.155638 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550508-xmpp9" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.159183 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.159730 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.161127 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.170148 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550508-xmpp9"] Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.189737 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rtf\" (UniqueName: \"kubernetes.io/projected/0632f7aa-01aa-4e06-b187-b35a6f7a68dd-kube-api-access-p6rtf\") pod \"auto-csr-approver-29550508-xmpp9\" (UID: \"0632f7aa-01aa-4e06-b187-b35a6f7a68dd\") " pod="openshift-infra/auto-csr-approver-29550508-xmpp9" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.291952 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rtf\" (UniqueName: \"kubernetes.io/projected/0632f7aa-01aa-4e06-b187-b35a6f7a68dd-kube-api-access-p6rtf\") pod \"auto-csr-approver-29550508-xmpp9\" (UID: \"0632f7aa-01aa-4e06-b187-b35a6f7a68dd\") " pod="openshift-infra/auto-csr-approver-29550508-xmpp9" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.312294 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rtf\" (UniqueName: \"kubernetes.io/projected/0632f7aa-01aa-4e06-b187-b35a6f7a68dd-kube-api-access-p6rtf\") pod \"auto-csr-approver-29550508-xmpp9\" (UID: \"0632f7aa-01aa-4e06-b187-b35a6f7a68dd\") " 
pod="openshift-infra/auto-csr-approver-29550508-xmpp9" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.478939 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550508-xmpp9" Mar 09 04:28:00 crc kubenswrapper[4901]: I0309 04:28:00.980457 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550508-xmpp9"] Mar 09 04:28:01 crc kubenswrapper[4901]: I0309 04:28:01.761697 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550508-xmpp9" event={"ID":"0632f7aa-01aa-4e06-b187-b35a6f7a68dd","Type":"ContainerStarted","Data":"fde357f195e4c6f0cb2d6cdbe129eec27ef740ebe40deccbe0bfbf54aa162b40"} Mar 09 04:28:02 crc kubenswrapper[4901]: I0309 04:28:02.774609 4901 generic.go:334] "Generic (PLEG): container finished" podID="0632f7aa-01aa-4e06-b187-b35a6f7a68dd" containerID="aacac1deea1b10699ae491849cfe34b5127e5e3d5b51c3656a27cb2dbe97ed64" exitCode=0 Mar 09 04:28:02 crc kubenswrapper[4901]: I0309 04:28:02.774666 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550508-xmpp9" event={"ID":"0632f7aa-01aa-4e06-b187-b35a6f7a68dd","Type":"ContainerDied","Data":"aacac1deea1b10699ae491849cfe34b5127e5e3d5b51c3656a27cb2dbe97ed64"} Mar 09 04:28:04 crc kubenswrapper[4901]: I0309 04:28:04.216479 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550508-xmpp9" Mar 09 04:28:04 crc kubenswrapper[4901]: I0309 04:28:04.365468 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6rtf\" (UniqueName: \"kubernetes.io/projected/0632f7aa-01aa-4e06-b187-b35a6f7a68dd-kube-api-access-p6rtf\") pod \"0632f7aa-01aa-4e06-b187-b35a6f7a68dd\" (UID: \"0632f7aa-01aa-4e06-b187-b35a6f7a68dd\") " Mar 09 04:28:04 crc kubenswrapper[4901]: I0309 04:28:04.370798 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0632f7aa-01aa-4e06-b187-b35a6f7a68dd-kube-api-access-p6rtf" (OuterVolumeSpecName: "kube-api-access-p6rtf") pod "0632f7aa-01aa-4e06-b187-b35a6f7a68dd" (UID: "0632f7aa-01aa-4e06-b187-b35a6f7a68dd"). InnerVolumeSpecName "kube-api-access-p6rtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:28:04 crc kubenswrapper[4901]: I0309 04:28:04.467976 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6rtf\" (UniqueName: \"kubernetes.io/projected/0632f7aa-01aa-4e06-b187-b35a6f7a68dd-kube-api-access-p6rtf\") on node \"crc\" DevicePath \"\"" Mar 09 04:28:04 crc kubenswrapper[4901]: I0309 04:28:04.797710 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550508-xmpp9" event={"ID":"0632f7aa-01aa-4e06-b187-b35a6f7a68dd","Type":"ContainerDied","Data":"fde357f195e4c6f0cb2d6cdbe129eec27ef740ebe40deccbe0bfbf54aa162b40"} Mar 09 04:28:04 crc kubenswrapper[4901]: I0309 04:28:04.797769 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fde357f195e4c6f0cb2d6cdbe129eec27ef740ebe40deccbe0bfbf54aa162b40" Mar 09 04:28:04 crc kubenswrapper[4901]: I0309 04:28:04.797799 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550508-xmpp9" Mar 09 04:28:05 crc kubenswrapper[4901]: I0309 04:28:05.303484 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550502-cm4mf"] Mar 09 04:28:05 crc kubenswrapper[4901]: I0309 04:28:05.313565 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550502-cm4mf"] Mar 09 04:28:06 crc kubenswrapper[4901]: I0309 04:28:06.123168 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b70874-34da-412e-a97e-9b855fe7c149" path="/var/lib/kubelet/pods/94b70874-34da-412e-a97e-9b855fe7c149/volumes" Mar 09 04:28:10 crc kubenswrapper[4901]: I0309 04:28:10.106336 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:28:10 crc kubenswrapper[4901]: I0309 04:28:10.858962 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"7f2a067a4431ae093b269f245287846947ed97ffeb5cdb48f9f1af1456173456"} Mar 09 04:28:44 crc kubenswrapper[4901]: I0309 04:28:44.138674 4901 scope.go:117] "RemoveContainer" containerID="4067f397471c33b2c93a890b6456f659152e6f70bd9f6fedf568bfe6402e467d" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.155448 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw"] Mar 09 04:30:00 crc kubenswrapper[4901]: E0309 04:30:00.156541 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0632f7aa-01aa-4e06-b187-b35a6f7a68dd" containerName="oc" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.156559 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0632f7aa-01aa-4e06-b187-b35a6f7a68dd" containerName="oc" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.156754 
4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0632f7aa-01aa-4e06-b187-b35a6f7a68dd" containerName="oc" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.157474 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.159870 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.160159 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.164320 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550510-fqxbf"] Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.166066 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550510-fqxbf" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.169742 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.169966 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.170405 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.176581 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550510-fqxbf"] Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.232457 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw"] Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.311325 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhdd5\" (UniqueName: \"kubernetes.io/projected/c73e245e-01e0-4a47-b008-5f09ae89d325-kube-api-access-qhdd5\") pod \"collect-profiles-29550510-ntxrw\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.311408 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmpml\" (UniqueName: \"kubernetes.io/projected/51dff792-b3db-45b2-91b9-3b0297f15fe1-kube-api-access-wmpml\") pod \"auto-csr-approver-29550510-fqxbf\" (UID: \"51dff792-b3db-45b2-91b9-3b0297f15fe1\") " pod="openshift-infra/auto-csr-approver-29550510-fqxbf" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.311596 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c73e245e-01e0-4a47-b008-5f09ae89d325-config-volume\") pod \"collect-profiles-29550510-ntxrw\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.311833 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c73e245e-01e0-4a47-b008-5f09ae89d325-secret-volume\") pod \"collect-profiles-29550510-ntxrw\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.413679 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhdd5\" (UniqueName: \"kubernetes.io/projected/c73e245e-01e0-4a47-b008-5f09ae89d325-kube-api-access-qhdd5\") pod \"collect-profiles-29550510-ntxrw\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.413791 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmpml\" (UniqueName: \"kubernetes.io/projected/51dff792-b3db-45b2-91b9-3b0297f15fe1-kube-api-access-wmpml\") pod \"auto-csr-approver-29550510-fqxbf\" (UID: \"51dff792-b3db-45b2-91b9-3b0297f15fe1\") " pod="openshift-infra/auto-csr-approver-29550510-fqxbf" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.413878 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c73e245e-01e0-4a47-b008-5f09ae89d325-config-volume\") pod \"collect-profiles-29550510-ntxrw\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.413983 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c73e245e-01e0-4a47-b008-5f09ae89d325-secret-volume\") pod \"collect-profiles-29550510-ntxrw\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.415729 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c73e245e-01e0-4a47-b008-5f09ae89d325-config-volume\") pod \"collect-profiles-29550510-ntxrw\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.424874 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c73e245e-01e0-4a47-b008-5f09ae89d325-secret-volume\") pod \"collect-profiles-29550510-ntxrw\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.432459 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhdd5\" (UniqueName: \"kubernetes.io/projected/c73e245e-01e0-4a47-b008-5f09ae89d325-kube-api-access-qhdd5\") pod \"collect-profiles-29550510-ntxrw\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.443924 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmpml\" (UniqueName: 
\"kubernetes.io/projected/51dff792-b3db-45b2-91b9-3b0297f15fe1-kube-api-access-wmpml\") pod \"auto-csr-approver-29550510-fqxbf\" (UID: \"51dff792-b3db-45b2-91b9-3b0297f15fe1\") " pod="openshift-infra/auto-csr-approver-29550510-fqxbf" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.499275 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.516591 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550510-fqxbf" Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.846405 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550510-fqxbf"] Mar 09 04:30:00 crc kubenswrapper[4901]: I0309 04:30:00.992167 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw"] Mar 09 04:30:01 crc kubenswrapper[4901]: W0309 04:30:01.000324 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc73e245e_01e0_4a47_b008_5f09ae89d325.slice/crio-6b8d18aa7753cb88d865bb0dcdb9c3c2f9be6c7298dc0823f3644f937e1992c5 WatchSource:0}: Error finding container 6b8d18aa7753cb88d865bb0dcdb9c3c2f9be6c7298dc0823f3644f937e1992c5: Status 404 returned error can't find the container with id 6b8d18aa7753cb88d865bb0dcdb9c3c2f9be6c7298dc0823f3644f937e1992c5 Mar 09 04:30:01 crc kubenswrapper[4901]: I0309 04:30:01.017074 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550510-fqxbf" event={"ID":"51dff792-b3db-45b2-91b9-3b0297f15fe1","Type":"ContainerStarted","Data":"0b6da9d5963513da7890dca95064d5d8c3c22c8d3b187d5de74058c192a76d61"} Mar 09 04:30:01 crc kubenswrapper[4901]: I0309 04:30:01.019203 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" event={"ID":"c73e245e-01e0-4a47-b008-5f09ae89d325","Type":"ContainerStarted","Data":"6b8d18aa7753cb88d865bb0dcdb9c3c2f9be6c7298dc0823f3644f937e1992c5"} Mar 09 04:30:02 crc kubenswrapper[4901]: I0309 04:30:02.037623 4901 generic.go:334] "Generic (PLEG): container finished" podID="c73e245e-01e0-4a47-b008-5f09ae89d325" containerID="aeb242ed82f0d2cffbd57436bd2f59902f4ff47f2c9723865f6589073b4860db" exitCode=0 Mar 09 04:30:02 crc kubenswrapper[4901]: I0309 04:30:02.037700 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" event={"ID":"c73e245e-01e0-4a47-b008-5f09ae89d325","Type":"ContainerDied","Data":"aeb242ed82f0d2cffbd57436bd2f59902f4ff47f2c9723865f6589073b4860db"} Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.049416 4901 generic.go:334] "Generic (PLEG): container finished" podID="51dff792-b3db-45b2-91b9-3b0297f15fe1" containerID="75698da71b77ab4f8aaa4215103d2e1275e326ecff932edfb55bd3c5a047d420" exitCode=0 Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.049490 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550510-fqxbf" event={"ID":"51dff792-b3db-45b2-91b9-3b0297f15fe1","Type":"ContainerDied","Data":"75698da71b77ab4f8aaa4215103d2e1275e326ecff932edfb55bd3c5a047d420"} Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.423201 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.469775 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c73e245e-01e0-4a47-b008-5f09ae89d325-config-volume\") pod \"c73e245e-01e0-4a47-b008-5f09ae89d325\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.469907 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhdd5\" (UniqueName: \"kubernetes.io/projected/c73e245e-01e0-4a47-b008-5f09ae89d325-kube-api-access-qhdd5\") pod \"c73e245e-01e0-4a47-b008-5f09ae89d325\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.469996 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c73e245e-01e0-4a47-b008-5f09ae89d325-secret-volume\") pod \"c73e245e-01e0-4a47-b008-5f09ae89d325\" (UID: \"c73e245e-01e0-4a47-b008-5f09ae89d325\") " Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.470716 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c73e245e-01e0-4a47-b008-5f09ae89d325-config-volume" (OuterVolumeSpecName: "config-volume") pod "c73e245e-01e0-4a47-b008-5f09ae89d325" (UID: "c73e245e-01e0-4a47-b008-5f09ae89d325"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.475984 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73e245e-01e0-4a47-b008-5f09ae89d325-kube-api-access-qhdd5" (OuterVolumeSpecName: "kube-api-access-qhdd5") pod "c73e245e-01e0-4a47-b008-5f09ae89d325" (UID: "c73e245e-01e0-4a47-b008-5f09ae89d325"). 
InnerVolumeSpecName "kube-api-access-qhdd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.476565 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73e245e-01e0-4a47-b008-5f09ae89d325-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c73e245e-01e0-4a47-b008-5f09ae89d325" (UID: "c73e245e-01e0-4a47-b008-5f09ae89d325"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.572551 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c73e245e-01e0-4a47-b008-5f09ae89d325-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.572582 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c73e245e-01e0-4a47-b008-5f09ae89d325-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 04:30:03 crc kubenswrapper[4901]: I0309 04:30:03.572594 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhdd5\" (UniqueName: \"kubernetes.io/projected/c73e245e-01e0-4a47-b008-5f09ae89d325-kube-api-access-qhdd5\") on node \"crc\" DevicePath \"\"" Mar 09 04:30:04 crc kubenswrapper[4901]: I0309 04:30:04.060277 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" event={"ID":"c73e245e-01e0-4a47-b008-5f09ae89d325","Type":"ContainerDied","Data":"6b8d18aa7753cb88d865bb0dcdb9c3c2f9be6c7298dc0823f3644f937e1992c5"} Mar 09 04:30:04 crc kubenswrapper[4901]: I0309 04:30:04.060690 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b8d18aa7753cb88d865bb0dcdb9c3c2f9be6c7298dc0823f3644f937e1992c5" Mar 09 04:30:04 crc kubenswrapper[4901]: I0309 04:30:04.060308 4901 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550510-ntxrw" Mar 09 04:30:04 crc kubenswrapper[4901]: I0309 04:30:04.453530 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550510-fqxbf" Mar 09 04:30:04 crc kubenswrapper[4901]: I0309 04:30:04.488111 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmpml\" (UniqueName: \"kubernetes.io/projected/51dff792-b3db-45b2-91b9-3b0297f15fe1-kube-api-access-wmpml\") pod \"51dff792-b3db-45b2-91b9-3b0297f15fe1\" (UID: \"51dff792-b3db-45b2-91b9-3b0297f15fe1\") " Mar 09 04:30:04 crc kubenswrapper[4901]: I0309 04:30:04.524989 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd"] Mar 09 04:30:04 crc kubenswrapper[4901]: I0309 04:30:04.528573 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51dff792-b3db-45b2-91b9-3b0297f15fe1-kube-api-access-wmpml" (OuterVolumeSpecName: "kube-api-access-wmpml") pod "51dff792-b3db-45b2-91b9-3b0297f15fe1" (UID: "51dff792-b3db-45b2-91b9-3b0297f15fe1"). InnerVolumeSpecName "kube-api-access-wmpml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:30:04 crc kubenswrapper[4901]: I0309 04:30:04.554879 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550465-4qjkd"] Mar 09 04:30:04 crc kubenswrapper[4901]: I0309 04:30:04.590396 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmpml\" (UniqueName: \"kubernetes.io/projected/51dff792-b3db-45b2-91b9-3b0297f15fe1-kube-api-access-wmpml\") on node \"crc\" DevicePath \"\"" Mar 09 04:30:05 crc kubenswrapper[4901]: I0309 04:30:05.071061 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550510-fqxbf" event={"ID":"51dff792-b3db-45b2-91b9-3b0297f15fe1","Type":"ContainerDied","Data":"0b6da9d5963513da7890dca95064d5d8c3c22c8d3b187d5de74058c192a76d61"} Mar 09 04:30:05 crc kubenswrapper[4901]: I0309 04:30:05.071427 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b6da9d5963513da7890dca95064d5d8c3c22c8d3b187d5de74058c192a76d61" Mar 09 04:30:05 crc kubenswrapper[4901]: I0309 04:30:05.071160 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550510-fqxbf" Mar 09 04:30:05 crc kubenswrapper[4901]: I0309 04:30:05.545573 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550504-7mjqt"] Mar 09 04:30:05 crc kubenswrapper[4901]: I0309 04:30:05.556908 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550504-7mjqt"] Mar 09 04:30:06 crc kubenswrapper[4901]: I0309 04:30:06.125649 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aff16dd-be1c-4dad-9397-1d215c1f9ef8" path="/var/lib/kubelet/pods/5aff16dd-be1c-4dad-9397-1d215c1f9ef8/volumes" Mar 09 04:30:06 crc kubenswrapper[4901]: I0309 04:30:06.127139 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="685ee708-3377-434c-b5ec-c7b61822f3e1" path="/var/lib/kubelet/pods/685ee708-3377-434c-b5ec-c7b61822f3e1/volumes" Mar 09 04:30:30 crc kubenswrapper[4901]: I0309 04:30:30.863154 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:30:30 crc kubenswrapper[4901]: I0309 04:30:30.863654 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:30:44 crc kubenswrapper[4901]: I0309 04:30:44.259744 4901 scope.go:117] "RemoveContainer" containerID="278c53b7cd9b6b602f979211b77f19a38dffa551b943485bcaed0a5baff64e03" Mar 09 04:30:44 crc kubenswrapper[4901]: I0309 04:30:44.287458 4901 scope.go:117] "RemoveContainer" 
containerID="50be4bf5bfc3bc874fb44c0c48bb61119d9960932bc2dd997a5d03f96f96f24e" Mar 09 04:31:00 crc kubenswrapper[4901]: I0309 04:31:00.863449 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:31:00 crc kubenswrapper[4901]: I0309 04:31:00.864351 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:31:30 crc kubenswrapper[4901]: I0309 04:31:30.862817 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:31:30 crc kubenswrapper[4901]: I0309 04:31:30.863386 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:31:30 crc kubenswrapper[4901]: I0309 04:31:30.863432 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 04:31:30 crc kubenswrapper[4901]: I0309 04:31:30.864288 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"7f2a067a4431ae093b269f245287846947ed97ffeb5cdb48f9f1af1456173456"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 04:31:30 crc kubenswrapper[4901]: I0309 04:31:30.864349 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://7f2a067a4431ae093b269f245287846947ed97ffeb5cdb48f9f1af1456173456" gracePeriod=600 Mar 09 04:31:31 crc kubenswrapper[4901]: I0309 04:31:31.014840 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="7f2a067a4431ae093b269f245287846947ed97ffeb5cdb48f9f1af1456173456" exitCode=0 Mar 09 04:31:31 crc kubenswrapper[4901]: I0309 04:31:31.014937 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"7f2a067a4431ae093b269f245287846947ed97ffeb5cdb48f9f1af1456173456"} Mar 09 04:31:31 crc kubenswrapper[4901]: I0309 04:31:31.015267 4901 scope.go:117] "RemoveContainer" containerID="100361249ed61d7cd1b77e958bc2d42a32f7e2074262e7f511a1dc3907fab49b" Mar 09 04:31:32 crc kubenswrapper[4901]: I0309 04:31:32.026720 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de"} Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.160467 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550512-d8j8v"] Mar 09 04:32:00 crc kubenswrapper[4901]: E0309 
04:32:00.161507 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73e245e-01e0-4a47-b008-5f09ae89d325" containerName="collect-profiles" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.161524 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73e245e-01e0-4a47-b008-5f09ae89d325" containerName="collect-profiles" Mar 09 04:32:00 crc kubenswrapper[4901]: E0309 04:32:00.161553 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51dff792-b3db-45b2-91b9-3b0297f15fe1" containerName="oc" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.161561 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="51dff792-b3db-45b2-91b9-3b0297f15fe1" containerName="oc" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.161758 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="51dff792-b3db-45b2-91b9-3b0297f15fe1" containerName="oc" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.161782 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c73e245e-01e0-4a47-b008-5f09ae89d325" containerName="collect-profiles" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.163920 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550512-d8j8v" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.211168 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h7kj\" (UniqueName: \"kubernetes.io/projected/faa014b3-f618-46b3-ba3b-94d0627b7572-kube-api-access-7h7kj\") pod \"auto-csr-approver-29550512-d8j8v\" (UID: \"faa014b3-f618-46b3-ba3b-94d0627b7572\") " pod="openshift-infra/auto-csr-approver-29550512-d8j8v" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.217624 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.217669 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.218027 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.221105 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550512-d8j8v"] Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.313591 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h7kj\" (UniqueName: \"kubernetes.io/projected/faa014b3-f618-46b3-ba3b-94d0627b7572-kube-api-access-7h7kj\") pod \"auto-csr-approver-29550512-d8j8v\" (UID: \"faa014b3-f618-46b3-ba3b-94d0627b7572\") " pod="openshift-infra/auto-csr-approver-29550512-d8j8v" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.339161 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h7kj\" (UniqueName: \"kubernetes.io/projected/faa014b3-f618-46b3-ba3b-94d0627b7572-kube-api-access-7h7kj\") pod \"auto-csr-approver-29550512-d8j8v\" (UID: \"faa014b3-f618-46b3-ba3b-94d0627b7572\") " 
pod="openshift-infra/auto-csr-approver-29550512-d8j8v" Mar 09 04:32:00 crc kubenswrapper[4901]: I0309 04:32:00.534828 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550512-d8j8v" Mar 09 04:32:01 crc kubenswrapper[4901]: I0309 04:32:01.043794 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550512-d8j8v"] Mar 09 04:32:01 crc kubenswrapper[4901]: I0309 04:32:01.279957 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550512-d8j8v" event={"ID":"faa014b3-f618-46b3-ba3b-94d0627b7572","Type":"ContainerStarted","Data":"788510eb613cb3dd74da18b686f50af3ad6009b6ab63ab9df0d13e09df304cf5"} Mar 09 04:32:03 crc kubenswrapper[4901]: I0309 04:32:03.300994 4901 generic.go:334] "Generic (PLEG): container finished" podID="faa014b3-f618-46b3-ba3b-94d0627b7572" containerID="f6ac12b0c6378005df9db812c89474ac5e6181a1aae0b5a0d62a4dae49fc67cd" exitCode=0 Mar 09 04:32:03 crc kubenswrapper[4901]: I0309 04:32:03.301094 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550512-d8j8v" event={"ID":"faa014b3-f618-46b3-ba3b-94d0627b7572","Type":"ContainerDied","Data":"f6ac12b0c6378005df9db812c89474ac5e6181a1aae0b5a0d62a4dae49fc67cd"} Mar 09 04:32:04 crc kubenswrapper[4901]: I0309 04:32:04.628724 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550512-d8j8v" Mar 09 04:32:04 crc kubenswrapper[4901]: I0309 04:32:04.784851 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h7kj\" (UniqueName: \"kubernetes.io/projected/faa014b3-f618-46b3-ba3b-94d0627b7572-kube-api-access-7h7kj\") pod \"faa014b3-f618-46b3-ba3b-94d0627b7572\" (UID: \"faa014b3-f618-46b3-ba3b-94d0627b7572\") " Mar 09 04:32:04 crc kubenswrapper[4901]: I0309 04:32:04.795538 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa014b3-f618-46b3-ba3b-94d0627b7572-kube-api-access-7h7kj" (OuterVolumeSpecName: "kube-api-access-7h7kj") pod "faa014b3-f618-46b3-ba3b-94d0627b7572" (UID: "faa014b3-f618-46b3-ba3b-94d0627b7572"). InnerVolumeSpecName "kube-api-access-7h7kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:32:04 crc kubenswrapper[4901]: I0309 04:32:04.886646 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h7kj\" (UniqueName: \"kubernetes.io/projected/faa014b3-f618-46b3-ba3b-94d0627b7572-kube-api-access-7h7kj\") on node \"crc\" DevicePath \"\"" Mar 09 04:32:05 crc kubenswrapper[4901]: I0309 04:32:05.323084 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550512-d8j8v" event={"ID":"faa014b3-f618-46b3-ba3b-94d0627b7572","Type":"ContainerDied","Data":"788510eb613cb3dd74da18b686f50af3ad6009b6ab63ab9df0d13e09df304cf5"} Mar 09 04:32:05 crc kubenswrapper[4901]: I0309 04:32:05.323127 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="788510eb613cb3dd74da18b686f50af3ad6009b6ab63ab9df0d13e09df304cf5" Mar 09 04:32:05 crc kubenswrapper[4901]: I0309 04:32:05.323176 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550512-d8j8v" Mar 09 04:32:05 crc kubenswrapper[4901]: I0309 04:32:05.706014 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550506-dcstq"] Mar 09 04:32:05 crc kubenswrapper[4901]: I0309 04:32:05.710974 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550506-dcstq"] Mar 09 04:32:06 crc kubenswrapper[4901]: I0309 04:32:06.117412 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ead7818d-771b-4ba3-b7ad-95ab0bd15378" path="/var/lib/kubelet/pods/ead7818d-771b-4ba3-b7ad-95ab0bd15378/volumes" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.682478 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pgsg2"] Mar 09 04:32:16 crc kubenswrapper[4901]: E0309 04:32:16.683469 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa014b3-f618-46b3-ba3b-94d0627b7572" containerName="oc" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.683486 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa014b3-f618-46b3-ba3b-94d0627b7572" containerName="oc" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.683711 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa014b3-f618-46b3-ba3b-94d0627b7572" containerName="oc" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.685286 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.703145 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgsg2"] Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.792909 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-utilities\") pod \"redhat-operators-pgsg2\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.793288 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx4td\" (UniqueName: \"kubernetes.io/projected/afff4a3f-156c-4b0c-9e54-81b835d8a70c-kube-api-access-gx4td\") pod \"redhat-operators-pgsg2\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.793319 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-catalog-content\") pod \"redhat-operators-pgsg2\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.894812 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-utilities\") pod \"redhat-operators-pgsg2\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.894858 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gx4td\" (UniqueName: \"kubernetes.io/projected/afff4a3f-156c-4b0c-9e54-81b835d8a70c-kube-api-access-gx4td\") pod \"redhat-operators-pgsg2\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.894885 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-catalog-content\") pod \"redhat-operators-pgsg2\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.895406 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-catalog-content\") pod \"redhat-operators-pgsg2\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.895435 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-utilities\") pod \"redhat-operators-pgsg2\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:16 crc kubenswrapper[4901]: I0309 04:32:16.918051 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx4td\" (UniqueName: \"kubernetes.io/projected/afff4a3f-156c-4b0c-9e54-81b835d8a70c-kube-api-access-gx4td\") pod \"redhat-operators-pgsg2\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:17 crc kubenswrapper[4901]: I0309 04:32:17.003415 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:17 crc kubenswrapper[4901]: I0309 04:32:17.461375 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgsg2"] Mar 09 04:32:18 crc kubenswrapper[4901]: I0309 04:32:18.429387 4901 generic.go:334] "Generic (PLEG): container finished" podID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerID="346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841" exitCode=0 Mar 09 04:32:18 crc kubenswrapper[4901]: I0309 04:32:18.429476 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgsg2" event={"ID":"afff4a3f-156c-4b0c-9e54-81b835d8a70c","Type":"ContainerDied","Data":"346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841"} Mar 09 04:32:18 crc kubenswrapper[4901]: I0309 04:32:18.429662 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgsg2" event={"ID":"afff4a3f-156c-4b0c-9e54-81b835d8a70c","Type":"ContainerStarted","Data":"36a6f0638ba3b3c99c7c0cd08769c22babbc71a4bee0f4c990ab3d52fed54c2a"} Mar 09 04:32:19 crc kubenswrapper[4901]: I0309 04:32:19.440958 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgsg2" event={"ID":"afff4a3f-156c-4b0c-9e54-81b835d8a70c","Type":"ContainerStarted","Data":"6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812"} Mar 09 04:32:20 crc kubenswrapper[4901]: I0309 04:32:20.452256 4901 generic.go:334] "Generic (PLEG): container finished" podID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerID="6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812" exitCode=0 Mar 09 04:32:20 crc kubenswrapper[4901]: I0309 04:32:20.452367 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgsg2" 
event={"ID":"afff4a3f-156c-4b0c-9e54-81b835d8a70c","Type":"ContainerDied","Data":"6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812"} Mar 09 04:32:21 crc kubenswrapper[4901]: I0309 04:32:21.461936 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgsg2" event={"ID":"afff4a3f-156c-4b0c-9e54-81b835d8a70c","Type":"ContainerStarted","Data":"63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b"} Mar 09 04:32:21 crc kubenswrapper[4901]: I0309 04:32:21.495283 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pgsg2" podStartSLOduration=3.018629106 podStartE2EDuration="5.495265777s" podCreationTimestamp="2026-03-09 04:32:16 +0000 UTC" firstStartedPulling="2026-03-09 04:32:18.430709665 +0000 UTC m=+6663.020373407" lastFinishedPulling="2026-03-09 04:32:20.907346316 +0000 UTC m=+6665.497010078" observedRunningTime="2026-03-09 04:32:21.489691161 +0000 UTC m=+6666.079354893" watchObservedRunningTime="2026-03-09 04:32:21.495265777 +0000 UTC m=+6666.084929509" Mar 09 04:32:27 crc kubenswrapper[4901]: I0309 04:32:27.004139 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:27 crc kubenswrapper[4901]: I0309 04:32:27.004678 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:28 crc kubenswrapper[4901]: I0309 04:32:28.061791 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pgsg2" podUID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerName="registry-server" probeResult="failure" output=< Mar 09 04:32:28 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Mar 09 04:32:28 crc kubenswrapper[4901]: > Mar 09 04:32:37 crc kubenswrapper[4901]: I0309 04:32:37.055307 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:37 crc kubenswrapper[4901]: I0309 04:32:37.114059 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:39 crc kubenswrapper[4901]: I0309 04:32:39.670375 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pgsg2"] Mar 09 04:32:39 crc kubenswrapper[4901]: I0309 04:32:39.670839 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pgsg2" podUID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerName="registry-server" containerID="cri-o://63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b" gracePeriod=2 Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.178172 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.225925 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx4td\" (UniqueName: \"kubernetes.io/projected/afff4a3f-156c-4b0c-9e54-81b835d8a70c-kube-api-access-gx4td\") pod \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.226013 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-utilities\") pod \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.226037 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-catalog-content\") pod 
\"afff4a3f-156c-4b0c-9e54-81b835d8a70c\" (UID: \"afff4a3f-156c-4b0c-9e54-81b835d8a70c\") " Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.228025 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-utilities" (OuterVolumeSpecName: "utilities") pod "afff4a3f-156c-4b0c-9e54-81b835d8a70c" (UID: "afff4a3f-156c-4b0c-9e54-81b835d8a70c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.233058 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afff4a3f-156c-4b0c-9e54-81b835d8a70c-kube-api-access-gx4td" (OuterVolumeSpecName: "kube-api-access-gx4td") pod "afff4a3f-156c-4b0c-9e54-81b835d8a70c" (UID: "afff4a3f-156c-4b0c-9e54-81b835d8a70c"). InnerVolumeSpecName "kube-api-access-gx4td". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.328387 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx4td\" (UniqueName: \"kubernetes.io/projected/afff4a3f-156c-4b0c-9e54-81b835d8a70c-kube-api-access-gx4td\") on node \"crc\" DevicePath \"\"" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.328435 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.353767 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afff4a3f-156c-4b0c-9e54-81b835d8a70c" (UID: "afff4a3f-156c-4b0c-9e54-81b835d8a70c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.430508 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afff4a3f-156c-4b0c-9e54-81b835d8a70c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.622196 4901 generic.go:334] "Generic (PLEG): container finished" podID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerID="63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b" exitCode=0 Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.622261 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgsg2" event={"ID":"afff4a3f-156c-4b0c-9e54-81b835d8a70c","Type":"ContainerDied","Data":"63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b"} Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.622307 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgsg2" event={"ID":"afff4a3f-156c-4b0c-9e54-81b835d8a70c","Type":"ContainerDied","Data":"36a6f0638ba3b3c99c7c0cd08769c22babbc71a4bee0f4c990ab3d52fed54c2a"} Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.622304 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pgsg2" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.622407 4901 scope.go:117] "RemoveContainer" containerID="63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.654518 4901 scope.go:117] "RemoveContainer" containerID="6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.676314 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pgsg2"] Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.682545 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pgsg2"] Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.692398 4901 scope.go:117] "RemoveContainer" containerID="346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.716122 4901 scope.go:117] "RemoveContainer" containerID="63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b" Mar 09 04:32:40 crc kubenswrapper[4901]: E0309 04:32:40.716732 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b\": container with ID starting with 63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b not found: ID does not exist" containerID="63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.716785 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b"} err="failed to get container status \"63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b\": rpc error: code = NotFound desc = could not find container 
\"63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b\": container with ID starting with 63ab12edef360a5be5af08e0b49f20817969ac80de2e456be67b99707bf1d55b not found: ID does not exist" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.716819 4901 scope.go:117] "RemoveContainer" containerID="6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812" Mar 09 04:32:40 crc kubenswrapper[4901]: E0309 04:32:40.717469 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812\": container with ID starting with 6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812 not found: ID does not exist" containerID="6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.717551 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812"} err="failed to get container status \"6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812\": rpc error: code = NotFound desc = could not find container \"6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812\": container with ID starting with 6d2e4b5f40f3263ec1cb34de235f713c86bf238e3215a930ffd0a5a7ae91b812 not found: ID does not exist" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.717594 4901 scope.go:117] "RemoveContainer" containerID="346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841" Mar 09 04:32:40 crc kubenswrapper[4901]: E0309 04:32:40.717973 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841\": container with ID starting with 346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841 not found: ID does not exist" 
containerID="346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841" Mar 09 04:32:40 crc kubenswrapper[4901]: I0309 04:32:40.718036 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841"} err="failed to get container status \"346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841\": rpc error: code = NotFound desc = could not find container \"346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841\": container with ID starting with 346e8f337646882775f04451800525d87f2d622610e48d2a889ef1124d49b841 not found: ID does not exist" Mar 09 04:32:42 crc kubenswrapper[4901]: I0309 04:32:42.126971 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" path="/var/lib/kubelet/pods/afff4a3f-156c-4b0c-9e54-81b835d8a70c/volumes" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.690466 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nj9np"] Mar 09 04:32:43 crc kubenswrapper[4901]: E0309 04:32:43.692306 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerName="registry-server" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.692466 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerName="registry-server" Mar 09 04:32:43 crc kubenswrapper[4901]: E0309 04:32:43.692635 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerName="extract-content" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.692768 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerName="extract-content" Mar 09 04:32:43 crc kubenswrapper[4901]: E0309 04:32:43.692920 4901 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerName="extract-utilities" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.693057 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerName="extract-utilities" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.693559 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="afff4a3f-156c-4b0c-9e54-81b835d8a70c" containerName="registry-server" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.695920 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.709846 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nj9np"] Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.808520 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-utilities\") pod \"certified-operators-nj9np\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.808703 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-catalog-content\") pod \"certified-operators-nj9np\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.808764 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t5pk\" (UniqueName: \"kubernetes.io/projected/426adfbe-8e2d-4fb7-aff5-52f5953a574f-kube-api-access-5t5pk\") pod 
\"certified-operators-nj9np\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.910263 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-catalog-content\") pod \"certified-operators-nj9np\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.910570 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t5pk\" (UniqueName: \"kubernetes.io/projected/426adfbe-8e2d-4fb7-aff5-52f5953a574f-kube-api-access-5t5pk\") pod \"certified-operators-nj9np\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.910762 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-utilities\") pod \"certified-operators-nj9np\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.910838 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-catalog-content\") pod \"certified-operators-nj9np\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.911299 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-utilities\") pod \"certified-operators-nj9np\" (UID: 
\"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:43 crc kubenswrapper[4901]: I0309 04:32:43.945756 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t5pk\" (UniqueName: \"kubernetes.io/projected/426adfbe-8e2d-4fb7-aff5-52f5953a574f-kube-api-access-5t5pk\") pod \"certified-operators-nj9np\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:44 crc kubenswrapper[4901]: I0309 04:32:44.022661 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:44 crc kubenswrapper[4901]: I0309 04:32:44.465342 4901 scope.go:117] "RemoveContainer" containerID="87594a0e39eff61661b0cfa988c28592af47973f050c1f1458d2d64ce6aed757" Mar 09 04:32:44 crc kubenswrapper[4901]: I0309 04:32:44.508292 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nj9np"] Mar 09 04:32:44 crc kubenswrapper[4901]: I0309 04:32:44.661830 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nj9np" event={"ID":"426adfbe-8e2d-4fb7-aff5-52f5953a574f","Type":"ContainerStarted","Data":"5df9f37c409c16ce756c59c87a916df59914a49d50fb9a3f4ac57439bc8744a8"} Mar 09 04:32:45 crc kubenswrapper[4901]: I0309 04:32:45.679088 4901 generic.go:334] "Generic (PLEG): container finished" podID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" containerID="20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad" exitCode=0 Mar 09 04:32:45 crc kubenswrapper[4901]: I0309 04:32:45.679608 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nj9np" event={"ID":"426adfbe-8e2d-4fb7-aff5-52f5953a574f","Type":"ContainerDied","Data":"20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad"} Mar 09 04:32:45 crc kubenswrapper[4901]: I0309 
04:32:45.686891 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 04:32:46 crc kubenswrapper[4901]: I0309 04:32:46.691062 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nj9np" event={"ID":"426adfbe-8e2d-4fb7-aff5-52f5953a574f","Type":"ContainerStarted","Data":"d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593"} Mar 09 04:32:47 crc kubenswrapper[4901]: I0309 04:32:47.704758 4901 generic.go:334] "Generic (PLEG): container finished" podID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" containerID="d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593" exitCode=0 Mar 09 04:32:47 crc kubenswrapper[4901]: I0309 04:32:47.704864 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nj9np" event={"ID":"426adfbe-8e2d-4fb7-aff5-52f5953a574f","Type":"ContainerDied","Data":"d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593"} Mar 09 04:32:48 crc kubenswrapper[4901]: I0309 04:32:48.717174 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nj9np" event={"ID":"426adfbe-8e2d-4fb7-aff5-52f5953a574f","Type":"ContainerStarted","Data":"25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68"} Mar 09 04:32:48 crc kubenswrapper[4901]: I0309 04:32:48.751523 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nj9np" podStartSLOduration=3.276505004 podStartE2EDuration="5.751497734s" podCreationTimestamp="2026-03-09 04:32:43 +0000 UTC" firstStartedPulling="2026-03-09 04:32:45.686545842 +0000 UTC m=+6690.276209584" lastFinishedPulling="2026-03-09 04:32:48.161538541 +0000 UTC m=+6692.751202314" observedRunningTime="2026-03-09 04:32:48.741708254 +0000 UTC m=+6693.331372006" watchObservedRunningTime="2026-03-09 04:32:48.751497734 +0000 UTC m=+6693.341161506" Mar 09 04:32:54 crc 
kubenswrapper[4901]: I0309 04:32:54.022929 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:54 crc kubenswrapper[4901]: I0309 04:32:54.023685 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:54 crc kubenswrapper[4901]: I0309 04:32:54.126580 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:54 crc kubenswrapper[4901]: I0309 04:32:54.875333 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.274051 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nj9np"] Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.274846 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nj9np" podUID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" containerName="registry-server" containerID="cri-o://25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68" gracePeriod=2 Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.760768 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.833534 4901 generic.go:334] "Generic (PLEG): container finished" podID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" containerID="25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68" exitCode=0 Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.833593 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nj9np" event={"ID":"426adfbe-8e2d-4fb7-aff5-52f5953a574f","Type":"ContainerDied","Data":"25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68"} Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.833626 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nj9np" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.833662 4901 scope.go:117] "RemoveContainer" containerID="25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.833643 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nj9np" event={"ID":"426adfbe-8e2d-4fb7-aff5-52f5953a574f","Type":"ContainerDied","Data":"5df9f37c409c16ce756c59c87a916df59914a49d50fb9a3f4ac57439bc8744a8"} Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.861003 4901 scope.go:117] "RemoveContainer" containerID="d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.893789 4901 scope.go:117] "RemoveContainer" containerID="20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.898449 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-utilities\") pod 
\"426adfbe-8e2d-4fb7-aff5-52f5953a574f\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.898515 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-catalog-content\") pod \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.898730 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t5pk\" (UniqueName: \"kubernetes.io/projected/426adfbe-8e2d-4fb7-aff5-52f5953a574f-kube-api-access-5t5pk\") pod \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\" (UID: \"426adfbe-8e2d-4fb7-aff5-52f5953a574f\") " Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.899932 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-utilities" (OuterVolumeSpecName: "utilities") pod "426adfbe-8e2d-4fb7-aff5-52f5953a574f" (UID: "426adfbe-8e2d-4fb7-aff5-52f5953a574f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.906397 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426adfbe-8e2d-4fb7-aff5-52f5953a574f-kube-api-access-5t5pk" (OuterVolumeSpecName: "kube-api-access-5t5pk") pod "426adfbe-8e2d-4fb7-aff5-52f5953a574f" (UID: "426adfbe-8e2d-4fb7-aff5-52f5953a574f"). InnerVolumeSpecName "kube-api-access-5t5pk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.932953 4901 scope.go:117] "RemoveContainer" containerID="25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68" Mar 09 04:32:58 crc kubenswrapper[4901]: E0309 04:32:58.934459 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68\": container with ID starting with 25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68 not found: ID does not exist" containerID="25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.934510 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68"} err="failed to get container status \"25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68\": rpc error: code = NotFound desc = could not find container \"25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68\": container with ID starting with 25936f963bff35090693453a620f54672c44522616efe8e7f52240b259e92c68 not found: ID does not exist" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.934545 4901 scope.go:117] "RemoveContainer" containerID="d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593" Mar 09 04:32:58 crc kubenswrapper[4901]: E0309 04:32:58.940208 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593\": container with ID starting with d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593 not found: ID does not exist" containerID="d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.940286 
4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593"} err="failed to get container status \"d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593\": rpc error: code = NotFound desc = could not find container \"d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593\": container with ID starting with d4f50baa98ee4e6c93f3bc033dde4794853d0b65fdceac421f158896ce843593 not found: ID does not exist" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.940318 4901 scope.go:117] "RemoveContainer" containerID="20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad" Mar 09 04:32:58 crc kubenswrapper[4901]: E0309 04:32:58.941186 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad\": container with ID starting with 20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad not found: ID does not exist" containerID="20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.941273 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad"} err="failed to get container status \"20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad\": rpc error: code = NotFound desc = could not find container \"20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad\": container with ID starting with 20f1d5f0afa49939c52bf049ac6572002f02bc07b71d835b4ae320e5b9e83aad not found: ID does not exist" Mar 09 04:32:58 crc kubenswrapper[4901]: I0309 04:32:58.982372 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "426adfbe-8e2d-4fb7-aff5-52f5953a574f" (UID: "426adfbe-8e2d-4fb7-aff5-52f5953a574f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:32:59 crc kubenswrapper[4901]: I0309 04:32:59.000415 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:32:59 crc kubenswrapper[4901]: I0309 04:32:59.000449 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426adfbe-8e2d-4fb7-aff5-52f5953a574f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:32:59 crc kubenswrapper[4901]: I0309 04:32:59.000463 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t5pk\" (UniqueName: \"kubernetes.io/projected/426adfbe-8e2d-4fb7-aff5-52f5953a574f-kube-api-access-5t5pk\") on node \"crc\" DevicePath \"\"" Mar 09 04:32:59 crc kubenswrapper[4901]: I0309 04:32:59.193905 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nj9np"] Mar 09 04:32:59 crc kubenswrapper[4901]: I0309 04:32:59.204365 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nj9np"] Mar 09 04:33:00 crc kubenswrapper[4901]: I0309 04:33:00.122369 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" path="/var/lib/kubelet/pods/426adfbe-8e2d-4fb7-aff5-52f5953a574f/volumes" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.567092 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mthrn/must-gather-mbwfc"] Mar 09 04:33:30 crc kubenswrapper[4901]: E0309 04:33:30.567777 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" 
containerName="extract-content" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.567788 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" containerName="extract-content" Mar 09 04:33:30 crc kubenswrapper[4901]: E0309 04:33:30.567801 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" containerName="extract-utilities" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.567806 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" containerName="extract-utilities" Mar 09 04:33:30 crc kubenswrapper[4901]: E0309 04:33:30.567814 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" containerName="registry-server" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.567822 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" containerName="registry-server" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.567970 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="426adfbe-8e2d-4fb7-aff5-52f5953a574f" containerName="registry-server" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.568745 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mthrn/must-gather-mbwfc" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.570816 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mthrn"/"kube-root-ca.crt" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.571052 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mthrn"/"openshift-service-ca.crt" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.576986 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mthrn/must-gather-mbwfc"] Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.577639 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mthrn"/"default-dockercfg-k2htj" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.704482 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk9c2\" (UniqueName: \"kubernetes.io/projected/79837b41-60c2-458a-8e72-77886e95872d-kube-api-access-sk9c2\") pod \"must-gather-mbwfc\" (UID: \"79837b41-60c2-458a-8e72-77886e95872d\") " pod="openshift-must-gather-mthrn/must-gather-mbwfc" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.704813 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79837b41-60c2-458a-8e72-77886e95872d-must-gather-output\") pod \"must-gather-mbwfc\" (UID: \"79837b41-60c2-458a-8e72-77886e95872d\") " pod="openshift-must-gather-mthrn/must-gather-mbwfc" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.806026 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79837b41-60c2-458a-8e72-77886e95872d-must-gather-output\") pod \"must-gather-mbwfc\" (UID: \"79837b41-60c2-458a-8e72-77886e95872d\") " 
pod="openshift-must-gather-mthrn/must-gather-mbwfc" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.806133 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk9c2\" (UniqueName: \"kubernetes.io/projected/79837b41-60c2-458a-8e72-77886e95872d-kube-api-access-sk9c2\") pod \"must-gather-mbwfc\" (UID: \"79837b41-60c2-458a-8e72-77886e95872d\") " pod="openshift-must-gather-mthrn/must-gather-mbwfc" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.806508 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79837b41-60c2-458a-8e72-77886e95872d-must-gather-output\") pod \"must-gather-mbwfc\" (UID: \"79837b41-60c2-458a-8e72-77886e95872d\") " pod="openshift-must-gather-mthrn/must-gather-mbwfc" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.826186 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk9c2\" (UniqueName: \"kubernetes.io/projected/79837b41-60c2-458a-8e72-77886e95872d-kube-api-access-sk9c2\") pod \"must-gather-mbwfc\" (UID: \"79837b41-60c2-458a-8e72-77886e95872d\") " pod="openshift-must-gather-mthrn/must-gather-mbwfc" Mar 09 04:33:30 crc kubenswrapper[4901]: I0309 04:33:30.884114 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mthrn/must-gather-mbwfc" Mar 09 04:33:31 crc kubenswrapper[4901]: I0309 04:33:31.327895 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mthrn/must-gather-mbwfc"] Mar 09 04:33:32 crc kubenswrapper[4901]: I0309 04:33:32.164033 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mthrn/must-gather-mbwfc" event={"ID":"79837b41-60c2-458a-8e72-77886e95872d","Type":"ContainerStarted","Data":"1ded01434451b57c4f4ffc36b951f51aa390e9d6be1ceb34bacf8cf75d05dc83"} Mar 09 04:33:38 crc kubenswrapper[4901]: I0309 04:33:38.218531 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mthrn/must-gather-mbwfc" event={"ID":"79837b41-60c2-458a-8e72-77886e95872d","Type":"ContainerStarted","Data":"431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9"} Mar 09 04:33:38 crc kubenswrapper[4901]: I0309 04:33:38.219005 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mthrn/must-gather-mbwfc" event={"ID":"79837b41-60c2-458a-8e72-77886e95872d","Type":"ContainerStarted","Data":"d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc"} Mar 09 04:33:38 crc kubenswrapper[4901]: I0309 04:33:38.237237 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mthrn/must-gather-mbwfc" podStartSLOduration=2.086423603 podStartE2EDuration="8.237196269s" podCreationTimestamp="2026-03-09 04:33:30 +0000 UTC" firstStartedPulling="2026-03-09 04:33:31.344645892 +0000 UTC m=+6735.934309634" lastFinishedPulling="2026-03-09 04:33:37.495418568 +0000 UTC m=+6742.085082300" observedRunningTime="2026-03-09 04:33:38.231603872 +0000 UTC m=+6742.821267594" watchObservedRunningTime="2026-03-09 04:33:38.237196269 +0000 UTC m=+6742.826860001" Mar 09 04:33:40 crc kubenswrapper[4901]: I0309 04:33:40.699267 4901 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-mthrn/crc-debug-9whff"] Mar 09 04:33:40 crc kubenswrapper[4901]: I0309 04:33:40.700647 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mthrn/crc-debug-9whff" Mar 09 04:33:40 crc kubenswrapper[4901]: I0309 04:33:40.796735 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvx49\" (UniqueName: \"kubernetes.io/projected/51509b37-855d-47a1-a180-71f2d728321a-kube-api-access-lvx49\") pod \"crc-debug-9whff\" (UID: \"51509b37-855d-47a1-a180-71f2d728321a\") " pod="openshift-must-gather-mthrn/crc-debug-9whff" Mar 09 04:33:40 crc kubenswrapper[4901]: I0309 04:33:40.797049 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51509b37-855d-47a1-a180-71f2d728321a-host\") pod \"crc-debug-9whff\" (UID: \"51509b37-855d-47a1-a180-71f2d728321a\") " pod="openshift-must-gather-mthrn/crc-debug-9whff" Mar 09 04:33:40 crc kubenswrapper[4901]: I0309 04:33:40.898817 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvx49\" (UniqueName: \"kubernetes.io/projected/51509b37-855d-47a1-a180-71f2d728321a-kube-api-access-lvx49\") pod \"crc-debug-9whff\" (UID: \"51509b37-855d-47a1-a180-71f2d728321a\") " pod="openshift-must-gather-mthrn/crc-debug-9whff" Mar 09 04:33:40 crc kubenswrapper[4901]: I0309 04:33:40.898891 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51509b37-855d-47a1-a180-71f2d728321a-host\") pod \"crc-debug-9whff\" (UID: \"51509b37-855d-47a1-a180-71f2d728321a\") " pod="openshift-must-gather-mthrn/crc-debug-9whff" Mar 09 04:33:40 crc kubenswrapper[4901]: I0309 04:33:40.898968 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/51509b37-855d-47a1-a180-71f2d728321a-host\") pod \"crc-debug-9whff\" (UID: \"51509b37-855d-47a1-a180-71f2d728321a\") " pod="openshift-must-gather-mthrn/crc-debug-9whff" Mar 09 04:33:40 crc kubenswrapper[4901]: I0309 04:33:40.927477 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvx49\" (UniqueName: \"kubernetes.io/projected/51509b37-855d-47a1-a180-71f2d728321a-kube-api-access-lvx49\") pod \"crc-debug-9whff\" (UID: \"51509b37-855d-47a1-a180-71f2d728321a\") " pod="openshift-must-gather-mthrn/crc-debug-9whff" Mar 09 04:33:41 crc kubenswrapper[4901]: I0309 04:33:41.021375 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mthrn/crc-debug-9whff" Mar 09 04:33:41 crc kubenswrapper[4901]: W0309 04:33:41.066861 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51509b37_855d_47a1_a180_71f2d728321a.slice/crio-e2e6b78563e624ff201c26df00ad1f577f803bcf541db6ef8d8880575771b268 WatchSource:0}: Error finding container e2e6b78563e624ff201c26df00ad1f577f803bcf541db6ef8d8880575771b268: Status 404 returned error can't find the container with id e2e6b78563e624ff201c26df00ad1f577f803bcf541db6ef8d8880575771b268 Mar 09 04:33:41 crc kubenswrapper[4901]: I0309 04:33:41.245718 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mthrn/crc-debug-9whff" event={"ID":"51509b37-855d-47a1-a180-71f2d728321a","Type":"ContainerStarted","Data":"e2e6b78563e624ff201c26df00ad1f577f803bcf541db6ef8d8880575771b268"} Mar 09 04:33:52 crc kubenswrapper[4901]: I0309 04:33:52.332873 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mthrn/crc-debug-9whff" event={"ID":"51509b37-855d-47a1-a180-71f2d728321a","Type":"ContainerStarted","Data":"87f753b8aee8fd279e5ace2efcec02185b6071d9f5e154c66ef47f102059eeb2"} Mar 09 04:33:52 crc kubenswrapper[4901]: I0309 
04:33:52.349179 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mthrn/crc-debug-9whff" podStartSLOduration=1.738338657 podStartE2EDuration="12.34916142s" podCreationTimestamp="2026-03-09 04:33:40 +0000 UTC" firstStartedPulling="2026-03-09 04:33:41.071214094 +0000 UTC m=+6745.660877826" lastFinishedPulling="2026-03-09 04:33:51.682036857 +0000 UTC m=+6756.271700589" observedRunningTime="2026-03-09 04:33:52.346234288 +0000 UTC m=+6756.935898040" watchObservedRunningTime="2026-03-09 04:33:52.34916142 +0000 UTC m=+6756.938825152" Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.170861 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550514-m9gf9"] Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.172751 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550514-m9gf9" Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.183357 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.183431 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.183755 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.190443 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550514-m9gf9"] Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.227791 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98xhh\" (UniqueName: \"kubernetes.io/projected/7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c-kube-api-access-98xhh\") pod \"auto-csr-approver-29550514-m9gf9\" (UID: 
\"7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c\") " pod="openshift-infra/auto-csr-approver-29550514-m9gf9" Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.330153 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98xhh\" (UniqueName: \"kubernetes.io/projected/7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c-kube-api-access-98xhh\") pod \"auto-csr-approver-29550514-m9gf9\" (UID: \"7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c\") " pod="openshift-infra/auto-csr-approver-29550514-m9gf9" Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.361324 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98xhh\" (UniqueName: \"kubernetes.io/projected/7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c-kube-api-access-98xhh\") pod \"auto-csr-approver-29550514-m9gf9\" (UID: \"7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c\") " pod="openshift-infra/auto-csr-approver-29550514-m9gf9" Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.499899 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550514-m9gf9" Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.863041 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.863582 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:34:00 crc kubenswrapper[4901]: I0309 04:34:00.941807 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550514-m9gf9"] Mar 09 04:34:01 crc kubenswrapper[4901]: I0309 04:34:01.412932 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550514-m9gf9" event={"ID":"7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c","Type":"ContainerStarted","Data":"12e4212788224e2d212a111e2fdf381e380a2a0818ad5cc55f93496de702e4c9"} Mar 09 04:34:03 crc kubenswrapper[4901]: I0309 04:34:03.434978 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550514-m9gf9" event={"ID":"7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c","Type":"ContainerStarted","Data":"3ef534da286b9976f26b21b00c4abaf314722d2d712ca729c0a9812b0abf8537"} Mar 09 04:34:03 crc kubenswrapper[4901]: I0309 04:34:03.460678 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550514-m9gf9" podStartSLOduration=2.617061974 podStartE2EDuration="3.460656479s" podCreationTimestamp="2026-03-09 04:34:00 +0000 UTC" firstStartedPulling="2026-03-09 
04:34:00.9587719 +0000 UTC m=+6765.548435632" lastFinishedPulling="2026-03-09 04:34:01.802366375 +0000 UTC m=+6766.392030137" observedRunningTime="2026-03-09 04:34:03.44844944 +0000 UTC m=+6768.038113172" watchObservedRunningTime="2026-03-09 04:34:03.460656479 +0000 UTC m=+6768.050320221" Mar 09 04:34:04 crc kubenswrapper[4901]: I0309 04:34:04.446909 4901 generic.go:334] "Generic (PLEG): container finished" podID="7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c" containerID="3ef534da286b9976f26b21b00c4abaf314722d2d712ca729c0a9812b0abf8537" exitCode=0 Mar 09 04:34:04 crc kubenswrapper[4901]: I0309 04:34:04.447012 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550514-m9gf9" event={"ID":"7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c","Type":"ContainerDied","Data":"3ef534da286b9976f26b21b00c4abaf314722d2d712ca729c0a9812b0abf8537"} Mar 09 04:34:06 crc kubenswrapper[4901]: I0309 04:34:06.809234 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550514-m9gf9" Mar 09 04:34:06 crc kubenswrapper[4901]: I0309 04:34:06.859873 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98xhh\" (UniqueName: \"kubernetes.io/projected/7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c-kube-api-access-98xhh\") pod \"7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c\" (UID: \"7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c\") " Mar 09 04:34:06 crc kubenswrapper[4901]: I0309 04:34:06.887433 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c-kube-api-access-98xhh" (OuterVolumeSpecName: "kube-api-access-98xhh") pod "7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c" (UID: "7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c"). InnerVolumeSpecName "kube-api-access-98xhh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:34:06 crc kubenswrapper[4901]: I0309 04:34:06.962182 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98xhh\" (UniqueName: \"kubernetes.io/projected/7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c-kube-api-access-98xhh\") on node \"crc\" DevicePath \"\"" Mar 09 04:34:07 crc kubenswrapper[4901]: I0309 04:34:07.479008 4901 generic.go:334] "Generic (PLEG): container finished" podID="51509b37-855d-47a1-a180-71f2d728321a" containerID="87f753b8aee8fd279e5ace2efcec02185b6071d9f5e154c66ef47f102059eeb2" exitCode=0 Mar 09 04:34:07 crc kubenswrapper[4901]: I0309 04:34:07.479097 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mthrn/crc-debug-9whff" event={"ID":"51509b37-855d-47a1-a180-71f2d728321a","Type":"ContainerDied","Data":"87f753b8aee8fd279e5ace2efcec02185b6071d9f5e154c66ef47f102059eeb2"} Mar 09 04:34:07 crc kubenswrapper[4901]: I0309 04:34:07.483105 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550514-m9gf9" event={"ID":"7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c","Type":"ContainerDied","Data":"12e4212788224e2d212a111e2fdf381e380a2a0818ad5cc55f93496de702e4c9"} Mar 09 04:34:07 crc kubenswrapper[4901]: I0309 04:34:07.483168 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e4212788224e2d212a111e2fdf381e380a2a0818ad5cc55f93496de702e4c9" Mar 09 04:34:07 crc kubenswrapper[4901]: I0309 04:34:07.483181 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550514-m9gf9" Mar 09 04:34:07 crc kubenswrapper[4901]: I0309 04:34:07.899788 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550508-xmpp9"] Mar 09 04:34:07 crc kubenswrapper[4901]: I0309 04:34:07.907841 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550508-xmpp9"] Mar 09 04:34:08 crc kubenswrapper[4901]: I0309 04:34:08.118634 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0632f7aa-01aa-4e06-b187-b35a6f7a68dd" path="/var/lib/kubelet/pods/0632f7aa-01aa-4e06-b187-b35a6f7a68dd/volumes" Mar 09 04:34:08 crc kubenswrapper[4901]: I0309 04:34:08.584781 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mthrn/crc-debug-9whff" Mar 09 04:34:08 crc kubenswrapper[4901]: I0309 04:34:08.614126 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mthrn/crc-debug-9whff"] Mar 09 04:34:08 crc kubenswrapper[4901]: I0309 04:34:08.619920 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mthrn/crc-debug-9whff"] Mar 09 04:34:08 crc kubenswrapper[4901]: I0309 04:34:08.695496 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51509b37-855d-47a1-a180-71f2d728321a-host\") pod \"51509b37-855d-47a1-a180-71f2d728321a\" (UID: \"51509b37-855d-47a1-a180-71f2d728321a\") " Mar 09 04:34:08 crc kubenswrapper[4901]: I0309 04:34:08.695630 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51509b37-855d-47a1-a180-71f2d728321a-host" (OuterVolumeSpecName: "host") pod "51509b37-855d-47a1-a180-71f2d728321a" (UID: "51509b37-855d-47a1-a180-71f2d728321a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 04:34:08 crc kubenswrapper[4901]: I0309 04:34:08.695810 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvx49\" (UniqueName: \"kubernetes.io/projected/51509b37-855d-47a1-a180-71f2d728321a-kube-api-access-lvx49\") pod \"51509b37-855d-47a1-a180-71f2d728321a\" (UID: \"51509b37-855d-47a1-a180-71f2d728321a\") " Mar 09 04:34:08 crc kubenswrapper[4901]: I0309 04:34:08.696258 4901 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51509b37-855d-47a1-a180-71f2d728321a-host\") on node \"crc\" DevicePath \"\"" Mar 09 04:34:08 crc kubenswrapper[4901]: I0309 04:34:08.707142 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51509b37-855d-47a1-a180-71f2d728321a-kube-api-access-lvx49" (OuterVolumeSpecName: "kube-api-access-lvx49") pod "51509b37-855d-47a1-a180-71f2d728321a" (UID: "51509b37-855d-47a1-a180-71f2d728321a"). InnerVolumeSpecName "kube-api-access-lvx49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:34:08 crc kubenswrapper[4901]: I0309 04:34:08.798606 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvx49\" (UniqueName: \"kubernetes.io/projected/51509b37-855d-47a1-a180-71f2d728321a-kube-api-access-lvx49\") on node \"crc\" DevicePath \"\"" Mar 09 04:34:09 crc kubenswrapper[4901]: I0309 04:34:09.502615 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2e6b78563e624ff201c26df00ad1f577f803bcf541db6ef8d8880575771b268" Mar 09 04:34:09 crc kubenswrapper[4901]: I0309 04:34:09.502690 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mthrn/crc-debug-9whff" Mar 09 04:34:09 crc kubenswrapper[4901]: I0309 04:34:09.847894 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mthrn/crc-debug-tbcht"] Mar 09 04:34:09 crc kubenswrapper[4901]: E0309 04:34:09.848294 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51509b37-855d-47a1-a180-71f2d728321a" containerName="container-00" Mar 09 04:34:09 crc kubenswrapper[4901]: I0309 04:34:09.848314 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="51509b37-855d-47a1-a180-71f2d728321a" containerName="container-00" Mar 09 04:34:09 crc kubenswrapper[4901]: E0309 04:34:09.848338 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c" containerName="oc" Mar 09 04:34:09 crc kubenswrapper[4901]: I0309 04:34:09.848347 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c" containerName="oc" Mar 09 04:34:09 crc kubenswrapper[4901]: I0309 04:34:09.848565 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="51509b37-855d-47a1-a180-71f2d728321a" containerName="container-00" Mar 09 04:34:09 crc kubenswrapper[4901]: I0309 04:34:09.848608 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c" containerName="oc" Mar 09 04:34:09 crc kubenswrapper[4901]: I0309 04:34:09.849218 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mthrn/crc-debug-tbcht" Mar 09 04:34:09 crc kubenswrapper[4901]: I0309 04:34:09.918177 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2k24\" (UniqueName: \"kubernetes.io/projected/c137d44b-f059-471b-b7dc-cda4114578c1-kube-api-access-g2k24\") pod \"crc-debug-tbcht\" (UID: \"c137d44b-f059-471b-b7dc-cda4114578c1\") " pod="openshift-must-gather-mthrn/crc-debug-tbcht" Mar 09 04:34:09 crc kubenswrapper[4901]: I0309 04:34:09.918341 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c137d44b-f059-471b-b7dc-cda4114578c1-host\") pod \"crc-debug-tbcht\" (UID: \"c137d44b-f059-471b-b7dc-cda4114578c1\") " pod="openshift-must-gather-mthrn/crc-debug-tbcht" Mar 09 04:34:10 crc kubenswrapper[4901]: I0309 04:34:10.019010 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2k24\" (UniqueName: \"kubernetes.io/projected/c137d44b-f059-471b-b7dc-cda4114578c1-kube-api-access-g2k24\") pod \"crc-debug-tbcht\" (UID: \"c137d44b-f059-471b-b7dc-cda4114578c1\") " pod="openshift-must-gather-mthrn/crc-debug-tbcht" Mar 09 04:34:10 crc kubenswrapper[4901]: I0309 04:34:10.019161 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c137d44b-f059-471b-b7dc-cda4114578c1-host\") pod \"crc-debug-tbcht\" (UID: \"c137d44b-f059-471b-b7dc-cda4114578c1\") " pod="openshift-must-gather-mthrn/crc-debug-tbcht" Mar 09 04:34:10 crc kubenswrapper[4901]: I0309 04:34:10.019280 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c137d44b-f059-471b-b7dc-cda4114578c1-host\") pod \"crc-debug-tbcht\" (UID: \"c137d44b-f059-471b-b7dc-cda4114578c1\") " pod="openshift-must-gather-mthrn/crc-debug-tbcht" Mar 09 04:34:10 crc 
kubenswrapper[4901]: I0309 04:34:10.036849 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2k24\" (UniqueName: \"kubernetes.io/projected/c137d44b-f059-471b-b7dc-cda4114578c1-kube-api-access-g2k24\") pod \"crc-debug-tbcht\" (UID: \"c137d44b-f059-471b-b7dc-cda4114578c1\") " pod="openshift-must-gather-mthrn/crc-debug-tbcht" Mar 09 04:34:10 crc kubenswrapper[4901]: I0309 04:34:10.125143 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51509b37-855d-47a1-a180-71f2d728321a" path="/var/lib/kubelet/pods/51509b37-855d-47a1-a180-71f2d728321a/volumes" Mar 09 04:34:10 crc kubenswrapper[4901]: I0309 04:34:10.170639 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mthrn/crc-debug-tbcht" Mar 09 04:34:10 crc kubenswrapper[4901]: I0309 04:34:10.511107 4901 generic.go:334] "Generic (PLEG): container finished" podID="c137d44b-f059-471b-b7dc-cda4114578c1" containerID="e6979b32c09d7c282089a9ef5a76428eeaac731bca5a3c305fee66e92656b06d" exitCode=1 Mar 09 04:34:10 crc kubenswrapper[4901]: I0309 04:34:10.511243 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mthrn/crc-debug-tbcht" event={"ID":"c137d44b-f059-471b-b7dc-cda4114578c1","Type":"ContainerDied","Data":"e6979b32c09d7c282089a9ef5a76428eeaac731bca5a3c305fee66e92656b06d"} Mar 09 04:34:10 crc kubenswrapper[4901]: I0309 04:34:10.511484 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mthrn/crc-debug-tbcht" event={"ID":"c137d44b-f059-471b-b7dc-cda4114578c1","Type":"ContainerStarted","Data":"e6b591baebefcf18293baeb17d46ebf05b91aea79679007e668205233c3dfaa3"} Mar 09 04:34:10 crc kubenswrapper[4901]: I0309 04:34:10.553759 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mthrn/crc-debug-tbcht"] Mar 09 04:34:10 crc kubenswrapper[4901]: I0309 04:34:10.565891 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-mthrn/crc-debug-tbcht"] Mar 09 04:34:11 crc kubenswrapper[4901]: I0309 04:34:11.615653 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mthrn/crc-debug-tbcht" Mar 09 04:34:11 crc kubenswrapper[4901]: I0309 04:34:11.648775 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c137d44b-f059-471b-b7dc-cda4114578c1-host\") pod \"c137d44b-f059-471b-b7dc-cda4114578c1\" (UID: \"c137d44b-f059-471b-b7dc-cda4114578c1\") " Mar 09 04:34:11 crc kubenswrapper[4901]: I0309 04:34:11.648986 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2k24\" (UniqueName: \"kubernetes.io/projected/c137d44b-f059-471b-b7dc-cda4114578c1-kube-api-access-g2k24\") pod \"c137d44b-f059-471b-b7dc-cda4114578c1\" (UID: \"c137d44b-f059-471b-b7dc-cda4114578c1\") " Mar 09 04:34:11 crc kubenswrapper[4901]: I0309 04:34:11.648998 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c137d44b-f059-471b-b7dc-cda4114578c1-host" (OuterVolumeSpecName: "host") pod "c137d44b-f059-471b-b7dc-cda4114578c1" (UID: "c137d44b-f059-471b-b7dc-cda4114578c1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 04:34:11 crc kubenswrapper[4901]: I0309 04:34:11.649753 4901 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c137d44b-f059-471b-b7dc-cda4114578c1-host\") on node \"crc\" DevicePath \"\"" Mar 09 04:34:11 crc kubenswrapper[4901]: I0309 04:34:11.655445 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c137d44b-f059-471b-b7dc-cda4114578c1-kube-api-access-g2k24" (OuterVolumeSpecName: "kube-api-access-g2k24") pod "c137d44b-f059-471b-b7dc-cda4114578c1" (UID: "c137d44b-f059-471b-b7dc-cda4114578c1"). 
InnerVolumeSpecName "kube-api-access-g2k24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:34:11 crc kubenswrapper[4901]: I0309 04:34:11.751066 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2k24\" (UniqueName: \"kubernetes.io/projected/c137d44b-f059-471b-b7dc-cda4114578c1-kube-api-access-g2k24\") on node \"crc\" DevicePath \"\"" Mar 09 04:34:12 crc kubenswrapper[4901]: I0309 04:34:12.125615 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c137d44b-f059-471b-b7dc-cda4114578c1" path="/var/lib/kubelet/pods/c137d44b-f059-471b-b7dc-cda4114578c1/volumes" Mar 09 04:34:12 crc kubenswrapper[4901]: I0309 04:34:12.534502 4901 scope.go:117] "RemoveContainer" containerID="e6979b32c09d7c282089a9ef5a76428eeaac731bca5a3c305fee66e92656b06d" Mar 09 04:34:12 crc kubenswrapper[4901]: I0309 04:34:12.534607 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mthrn/crc-debug-tbcht" Mar 09 04:34:30 crc kubenswrapper[4901]: I0309 04:34:30.862745 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:34:30 crc kubenswrapper[4901]: I0309 04:34:30.863392 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:34:34 crc kubenswrapper[4901]: I0309 04:34:34.566444 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7989cc7f6f-vv8d8_63fc5507-22a2-4871-a6bf-557a5e4cde6b/init/0.log" Mar 09 04:34:34 crc 
kubenswrapper[4901]: I0309 04:34:34.726486 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7989cc7f6f-vv8d8_63fc5507-22a2-4871-a6bf-557a5e4cde6b/init/0.log" Mar 09 04:34:34 crc kubenswrapper[4901]: I0309 04:34:34.736273 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7989cc7f6f-vv8d8_63fc5507-22a2-4871-a6bf-557a5e4cde6b/dnsmasq-dns/0.log" Mar 09 04:34:34 crc kubenswrapper[4901]: I0309 04:34:34.930874 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7bd95c8c59-n8g5r_4f5275cc-6600-4603-9252-11131d31cd1b/keystone-api/0.log" Mar 09 04:34:34 crc kubenswrapper[4901]: I0309 04:34:34.987064 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_213b167a-15b0-4174-955f-f10bfbab4262/adoption/0.log" Mar 09 04:34:35 crc kubenswrapper[4901]: I0309 04:34:35.208940 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5de13b0e-52fc-4b56-b8bc-28e67614db67/mysql-bootstrap/0.log" Mar 09 04:34:35 crc kubenswrapper[4901]: I0309 04:34:35.567750 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5de13b0e-52fc-4b56-b8bc-28e67614db67/galera/0.log" Mar 09 04:34:35 crc kubenswrapper[4901]: I0309 04:34:35.630884 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5de13b0e-52fc-4b56-b8bc-28e67614db67/mysql-bootstrap/0.log" Mar 09 04:34:35 crc kubenswrapper[4901]: I0309 04:34:35.760519 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c28b6405-4576-4ec2-b596-db1e1a35d148/mysql-bootstrap/0.log" Mar 09 04:34:35 crc kubenswrapper[4901]: I0309 04:34:35.952298 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c28b6405-4576-4ec2-b596-db1e1a35d148/galera/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.016358 4901 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c28b6405-4576-4ec2-b596-db1e1a35d148/mysql-bootstrap/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.119097 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6b898827-c802-48ee-b7b9-17e6a6706ef3/openstackclient/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.192631 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_0955f311-3f27-47c0-85e5-e9a5f8136516/adoption/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.253267 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_929b82d8-ca48-4ccc-b938-8224fae0ac25/memcached/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.354840 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c5890487-b519-475d-9856-6449948d8f14/openstack-network-exporter/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.427596 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c5890487-b519-475d-9856-6449948d8f14/ovn-northd/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.551153 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13d33b0e-cd98-42a1-8cff-58f6592d8818/openstack-network-exporter/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.562271 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13d33b0e-cd98-42a1-8cff-58f6592d8818/ovsdbserver-nb/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.683851 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_3a4b857e-ca99-46cb-b4dc-ce16c46649ff/openstack-network-exporter/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.725469 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_3a4b857e-ca99-46cb-b4dc-ce16c46649ff/ovsdbserver-nb/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.876459 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_25f843bc-b581-4ff6-a7e3-fa153dbc5fff/ovsdbserver-nb/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.920867 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_25f843bc-b581-4ff6-a7e3-fa153dbc5fff/openstack-network-exporter/0.log" Mar 09 04:34:36 crc kubenswrapper[4901]: I0309 04:34:36.989136 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_423e84f6-36ec-4649-97e4-faf3da93684e/openstack-network-exporter/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.065430 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_423e84f6-36ec-4649-97e4-faf3da93684e/ovsdbserver-sb/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.170065 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_73820d01-78b9-4eb5-bbdd-1192bba6335b/ovsdbserver-sb/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.184660 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_73820d01-78b9-4eb5-bbdd-1192bba6335b/openstack-network-exporter/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.304452 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_a75a85d8-38a8-4799-8a0a-ca67151aa49a/openstack-network-exporter/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.359993 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_a75a85d8-38a8-4799-8a0a-ca67151aa49a/ovsdbserver-sb/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.474888 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_50d7049e-19ad-4936-950c-2bcede63c496/setup-container/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.640192 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_50d7049e-19ad-4936-950c-2bcede63c496/rabbitmq/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.674696 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_50d7049e-19ad-4936-950c-2bcede63c496/setup-container/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.702395 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8b05fa0b-5691-466e-9256-812e4809adb2/setup-container/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.883706 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8b05fa0b-5691-466e-9256-812e4809adb2/rabbitmq/0.log" Mar 09 04:34:37 crc kubenswrapper[4901]: I0309 04:34:37.890999 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8b05fa0b-5691-466e-9256-812e4809adb2/setup-container/0.log" Mar 09 04:34:44 crc kubenswrapper[4901]: I0309 04:34:44.575558 4901 scope.go:117] "RemoveContainer" containerID="aacac1deea1b10699ae491849cfe34b5127e5e3d5b51c3656a27cb2dbe97ed64" Mar 09 04:34:55 crc kubenswrapper[4901]: I0309 04:34:55.994113 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55_1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa/util/0.log" Mar 09 04:34:56 crc kubenswrapper[4901]: I0309 04:34:56.160606 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55_1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa/util/0.log" Mar 09 04:34:56 crc kubenswrapper[4901]: I0309 04:34:56.198654 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55_1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa/pull/0.log" Mar 09 04:34:56 crc kubenswrapper[4901]: I0309 04:34:56.206339 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55_1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa/pull/0.log" Mar 09 04:34:56 crc kubenswrapper[4901]: I0309 04:34:56.365209 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55_1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa/util/0.log" Mar 09 04:34:56 crc kubenswrapper[4901]: I0309 04:34:56.369623 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55_1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa/extract/0.log" Mar 09 04:34:56 crc kubenswrapper[4901]: I0309 04:34:56.382815 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f995bc55_1a9a0c4e-56a2-407d-b408-22fc2fb5a3fa/pull/0.log" Mar 09 04:34:56 crc kubenswrapper[4901]: I0309 04:34:56.735286 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-bvjjz_e690f8b6-f8ab-42fb-8e9c-be5dcbc52de4/manager/0.log" Mar 09 04:34:57 crc kubenswrapper[4901]: I0309 04:34:57.089786 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-gclrf_68bf831f-f763-4c43-b57f-13d244b3a21e/manager/0.log" Mar 09 04:34:57 crc kubenswrapper[4901]: I0309 04:34:57.228061 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-xrb6p_5c11cb5a-ab01-422d-a70b-33bb9dd06f8b/manager/0.log" Mar 09 04:34:57 crc kubenswrapper[4901]: I0309 
04:34:57.463894 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-lfc7j_b99c3916-afa1-4f6c-a25e-ca7a7a30d5c6/manager/0.log" Mar 09 04:34:57 crc kubenswrapper[4901]: I0309 04:34:57.934560 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-5qmp2_3644aa78-798b-40f2-9041-700fc89959e0/manager/0.log" Mar 09 04:34:58 crc kubenswrapper[4901]: I0309 04:34:58.320936 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-24mvj_21d847c9-9877-4e8d-b414-7f8035ebfc32/manager/0.log" Mar 09 04:34:58 crc kubenswrapper[4901]: I0309 04:34:58.382776 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-tkpsk_6ee7156f-f994-4e79-875d-744fed479fcf/manager/0.log" Mar 09 04:34:58 crc kubenswrapper[4901]: I0309 04:34:58.552256 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-n5px9_1b18c23b-f21d-4935-898a-2864b473119c/manager/0.log" Mar 09 04:34:58 crc kubenswrapper[4901]: I0309 04:34:58.813242 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-h2bk7_49531808-52f2-497a-98bc-61883926e221/manager/0.log" Mar 09 04:34:59 crc kubenswrapper[4901]: I0309 04:34:59.088029 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-bvb9v_a6308427-401a-4c01-afdb-e385f8efc20d/manager/0.log" Mar 09 04:34:59 crc kubenswrapper[4901]: I0309 04:34:59.105070 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-7sctc_e7f9248a-6f54-4cbd-9225-a601e2dd4e93/manager/0.log" Mar 09 04:34:59 crc 
kubenswrapper[4901]: I0309 04:34:59.418139 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-6wntt_e6489b4e-ac8f-4853-90a2-6f7ec0af3367/manager/0.log" Mar 09 04:34:59 crc kubenswrapper[4901]: I0309 04:34:59.433897 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-mksnm_00255202-1625-4595-a5d2-90aadb87fcfc/manager/0.log" Mar 09 04:34:59 crc kubenswrapper[4901]: I0309 04:34:59.742239 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-dc6dbbbd-x7pjx_dbe9fe4b-dcc0-4c01-9de1-8f68a30ca974/manager/0.log" Mar 09 04:34:59 crc kubenswrapper[4901]: I0309 04:34:59.948040 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6f44f7b99f-btdwg_e881f11e-7b7b-4a3e-9c63-fbd2d8cd61dd/operator/0.log" Mar 09 04:35:00 crc kubenswrapper[4901]: I0309 04:35:00.336297 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-87jmb_3cd16b8c-edb0-460a-988b-539bafafb5a0/registry-server/0.log" Mar 09 04:35:00 crc kubenswrapper[4901]: I0309 04:35:00.370430 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-x98gb_15663ad3-38d8-4a71-88fd-28f74b590e6e/manager/0.log" Mar 09 04:35:00 crc kubenswrapper[4901]: I0309 04:35:00.560190 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-5z5dd_17fc5096-7a83-4918-ba7f-213f188a1ce3/manager/0.log" Mar 09 04:35:00 crc kubenswrapper[4901]: I0309 04:35:00.743598 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wvsdc_38e4f81e-db3d-4d6d-824c-cdcf6e42ab1f/operator/0.log" Mar 09 
04:35:00 crc kubenswrapper[4901]: I0309 04:35:00.862412 4901 patch_prober.go:28] interesting pod/machine-config-daemon-5c998 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 04:35:00 crc kubenswrapper[4901]: I0309 04:35:00.862687 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 04:35:00 crc kubenswrapper[4901]: I0309 04:35:00.862739 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5c998" Mar 09 04:35:00 crc kubenswrapper[4901]: I0309 04:35:00.863458 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de"} pod="openshift-machine-config-operator/machine-config-daemon-5c998" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 04:35:00 crc kubenswrapper[4901]: I0309 04:35:00.863506 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerName="machine-config-daemon" containerID="cri-o://314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" gracePeriod=600 Mar 09 04:35:00 crc kubenswrapper[4901]: I0309 04:35:00.895503 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-2qbw7_4b68ef50-4ff4-420c-a455-ec9dd86db4cc/manager/0.log" Mar 09 04:35:00 crc kubenswrapper[4901]: E0309 04:35:00.994529 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:35:01 crc kubenswrapper[4901]: I0309 04:35:01.057761 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-8d9wj_37afdbe5-071a-4161-8390-1de33fefd993/manager/0.log" Mar 09 04:35:01 crc kubenswrapper[4901]: I0309 04:35:01.164985 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-6qs6f_40444baa-6195-4d1a-9704-21874564d865/manager/0.log" Mar 09 04:35:01 crc kubenswrapper[4901]: I0309 04:35:01.321868 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dfcb4d64f-jmbhm_801bcadf-a5a1-4b3c-9564-e1e21ff68f7f/manager/0.log" Mar 09 04:35:01 crc kubenswrapper[4901]: I0309 04:35:01.331323 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-r2rzc_f86fd171-86dc-4bad-8b53-24cde4942e76/manager/0.log" Mar 09 04:35:01 crc kubenswrapper[4901]: I0309 04:35:01.937893 4901 generic.go:334] "Generic (PLEG): container finished" podID="65e722e8-52c4-4bb6-9927-f378b2f7296a" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" exitCode=0 Mar 09 04:35:01 crc kubenswrapper[4901]: I0309 04:35:01.937931 4901 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerDied","Data":"314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de"} Mar 09 04:35:01 crc kubenswrapper[4901]: I0309 04:35:01.937964 4901 scope.go:117] "RemoveContainer" containerID="7f2a067a4431ae093b269f245287846947ed97ffeb5cdb48f9f1af1456173456" Mar 09 04:35:01 crc kubenswrapper[4901]: I0309 04:35:01.938753 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:35:01 crc kubenswrapper[4901]: E0309 04:35:01.939283 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:35:07 crc kubenswrapper[4901]: I0309 04:35:07.638155 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-l7qrj_150f46cd-b329-412c-b0a9-acd69b79a434/manager/0.log" Mar 09 04:35:14 crc kubenswrapper[4901]: I0309 04:35:14.110280 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:35:14 crc kubenswrapper[4901]: E0309 04:35:14.111055 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" 
podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:35:24 crc kubenswrapper[4901]: I0309 04:35:24.368843 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8m6x2_2009785b-23de-4c85-8dbc-285219ade858/control-plane-machine-set-operator/0.log" Mar 09 04:35:24 crc kubenswrapper[4901]: I0309 04:35:24.548430 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f6c5c_8919ddf1-096f-407f-8ea3-91a26a623f43/kube-rbac-proxy/0.log" Mar 09 04:35:24 crc kubenswrapper[4901]: I0309 04:35:24.589920 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f6c5c_8919ddf1-096f-407f-8ea3-91a26a623f43/machine-api-operator/0.log" Mar 09 04:35:27 crc kubenswrapper[4901]: I0309 04:35:27.106792 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:35:27 crc kubenswrapper[4901]: E0309 04:35:27.107255 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:35:38 crc kubenswrapper[4901]: I0309 04:35:38.715649 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-pvmjq_cdf41bf1-67bd-4b7e-b2cd-d629a7b0ab15/cert-manager-controller/0.log" Mar 09 04:35:38 crc kubenswrapper[4901]: I0309 04:35:38.873609 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-bncdj_f07a88fa-51e7-43c8-b640-fdf34d1d2957/cert-manager-cainjector/0.log" Mar 09 04:35:38 
crc kubenswrapper[4901]: I0309 04:35:38.877296 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-tskr9_54d6e767-b2a0-472c-ade7-a8f22284526b/cert-manager-webhook/0.log" Mar 09 04:35:41 crc kubenswrapper[4901]: I0309 04:35:41.107566 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:35:41 crc kubenswrapper[4901]: E0309 04:35:41.108538 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:35:52 crc kubenswrapper[4901]: I0309 04:35:52.107031 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:35:52 crc kubenswrapper[4901]: E0309 04:35:52.107762 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:35:52 crc kubenswrapper[4901]: I0309 04:35:52.232913 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-sdgzs_2fa716e4-3b2c-4ad8-b89b-cd7a931a658d/nmstate-console-plugin/0.log" Mar 09 04:35:52 crc kubenswrapper[4901]: I0309 04:35:52.417261 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-xfhkr_8397d2d3-a343-42d3-9443-03474b7ad195/nmstate-handler/0.log" Mar 09 04:35:52 crc kubenswrapper[4901]: I0309 04:35:52.509642 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-w2vms_052e1ec6-21bf-4ce6-9460-d639e85112a4/kube-rbac-proxy/0.log" Mar 09 04:35:52 crc kubenswrapper[4901]: I0309 04:35:52.567725 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-w2vms_052e1ec6-21bf-4ce6-9460-d639e85112a4/nmstate-metrics/0.log" Mar 09 04:35:52 crc kubenswrapper[4901]: I0309 04:35:52.694160 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-kzrdg_704c84a3-97b2-4627-a1fe-7d187799db15/nmstate-operator/0.log" Mar 09 04:35:52 crc kubenswrapper[4901]: I0309 04:35:52.744708 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-l5p5p_23c87d69-7fab-4ce7-99bc-9d076777b1ef/nmstate-webhook/0.log" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.153464 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550516-4mf6q"] Mar 09 04:36:00 crc kubenswrapper[4901]: E0309 04:36:00.154434 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c137d44b-f059-471b-b7dc-cda4114578c1" containerName="container-00" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.154454 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c137d44b-f059-471b-b7dc-cda4114578c1" containerName="container-00" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.154685 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c137d44b-f059-471b-b7dc-cda4114578c1" containerName="container-00" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.155407 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550516-4mf6q" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.158939 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.159004 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.159005 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.166721 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550516-4mf6q"] Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.262713 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rdgj\" (UniqueName: \"kubernetes.io/projected/487c302a-a733-4e69-9cb3-a7e73903dcf3-kube-api-access-8rdgj\") pod \"auto-csr-approver-29550516-4mf6q\" (UID: \"487c302a-a733-4e69-9cb3-a7e73903dcf3\") " pod="openshift-infra/auto-csr-approver-29550516-4mf6q" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.365103 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rdgj\" (UniqueName: \"kubernetes.io/projected/487c302a-a733-4e69-9cb3-a7e73903dcf3-kube-api-access-8rdgj\") pod \"auto-csr-approver-29550516-4mf6q\" (UID: \"487c302a-a733-4e69-9cb3-a7e73903dcf3\") " pod="openshift-infra/auto-csr-approver-29550516-4mf6q" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.388945 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rdgj\" (UniqueName: \"kubernetes.io/projected/487c302a-a733-4e69-9cb3-a7e73903dcf3-kube-api-access-8rdgj\") pod \"auto-csr-approver-29550516-4mf6q\" (UID: \"487c302a-a733-4e69-9cb3-a7e73903dcf3\") " 
pod="openshift-infra/auto-csr-approver-29550516-4mf6q" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.518069 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550516-4mf6q" Mar 09 04:36:00 crc kubenswrapper[4901]: I0309 04:36:00.988041 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550516-4mf6q"] Mar 09 04:36:01 crc kubenswrapper[4901]: I0309 04:36:01.469396 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550516-4mf6q" event={"ID":"487c302a-a733-4e69-9cb3-a7e73903dcf3","Type":"ContainerStarted","Data":"f2d6c9d574ec6af975c8334c7364254bd77ccf0f8cddfa2a2a20fa7935632667"} Mar 09 04:36:02 crc kubenswrapper[4901]: I0309 04:36:02.479971 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550516-4mf6q" event={"ID":"487c302a-a733-4e69-9cb3-a7e73903dcf3","Type":"ContainerStarted","Data":"14baef5404803e9945575536f7033244a0869c6fad5493dd680a2e1259139fa0"} Mar 09 04:36:02 crc kubenswrapper[4901]: I0309 04:36:02.499640 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550516-4mf6q" podStartSLOduration=1.416520335 podStartE2EDuration="2.499619898s" podCreationTimestamp="2026-03-09 04:36:00 +0000 UTC" firstStartedPulling="2026-03-09 04:36:01.006822669 +0000 UTC m=+6885.596486401" lastFinishedPulling="2026-03-09 04:36:02.089922192 +0000 UTC m=+6886.679585964" observedRunningTime="2026-03-09 04:36:02.491827347 +0000 UTC m=+6887.081491079" watchObservedRunningTime="2026-03-09 04:36:02.499619898 +0000 UTC m=+6887.089283630" Mar 09 04:36:03 crc kubenswrapper[4901]: I0309 04:36:03.491394 4901 generic.go:334] "Generic (PLEG): container finished" podID="487c302a-a733-4e69-9cb3-a7e73903dcf3" containerID="14baef5404803e9945575536f7033244a0869c6fad5493dd680a2e1259139fa0" exitCode=0 Mar 09 04:36:03 crc 
kubenswrapper[4901]: I0309 04:36:03.491477 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550516-4mf6q" event={"ID":"487c302a-a733-4e69-9cb3-a7e73903dcf3","Type":"ContainerDied","Data":"14baef5404803e9945575536f7033244a0869c6fad5493dd680a2e1259139fa0"} Mar 09 04:36:04 crc kubenswrapper[4901]: I0309 04:36:04.824937 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550516-4mf6q" Mar 09 04:36:04 crc kubenswrapper[4901]: I0309 04:36:04.967447 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rdgj\" (UniqueName: \"kubernetes.io/projected/487c302a-a733-4e69-9cb3-a7e73903dcf3-kube-api-access-8rdgj\") pod \"487c302a-a733-4e69-9cb3-a7e73903dcf3\" (UID: \"487c302a-a733-4e69-9cb3-a7e73903dcf3\") " Mar 09 04:36:04 crc kubenswrapper[4901]: I0309 04:36:04.972749 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487c302a-a733-4e69-9cb3-a7e73903dcf3-kube-api-access-8rdgj" (OuterVolumeSpecName: "kube-api-access-8rdgj") pod "487c302a-a733-4e69-9cb3-a7e73903dcf3" (UID: "487c302a-a733-4e69-9cb3-a7e73903dcf3"). InnerVolumeSpecName "kube-api-access-8rdgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:36:05 crc kubenswrapper[4901]: I0309 04:36:05.069882 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rdgj\" (UniqueName: \"kubernetes.io/projected/487c302a-a733-4e69-9cb3-a7e73903dcf3-kube-api-access-8rdgj\") on node \"crc\" DevicePath \"\"" Mar 09 04:36:05 crc kubenswrapper[4901]: I0309 04:36:05.512501 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550516-4mf6q" event={"ID":"487c302a-a733-4e69-9cb3-a7e73903dcf3","Type":"ContainerDied","Data":"f2d6c9d574ec6af975c8334c7364254bd77ccf0f8cddfa2a2a20fa7935632667"} Mar 09 04:36:05 crc kubenswrapper[4901]: I0309 04:36:05.512555 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2d6c9d574ec6af975c8334c7364254bd77ccf0f8cddfa2a2a20fa7935632667" Mar 09 04:36:05 crc kubenswrapper[4901]: I0309 04:36:05.512651 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550516-4mf6q" Mar 09 04:36:05 crc kubenswrapper[4901]: I0309 04:36:05.579151 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550510-fqxbf"] Mar 09 04:36:05 crc kubenswrapper[4901]: I0309 04:36:05.592683 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550510-fqxbf"] Mar 09 04:36:06 crc kubenswrapper[4901]: I0309 04:36:06.112031 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:36:06 crc kubenswrapper[4901]: E0309 04:36:06.112580 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:36:06 crc kubenswrapper[4901]: I0309 04:36:06.117347 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51dff792-b3db-45b2-91b9-3b0297f15fe1" path="/var/lib/kubelet/pods/51dff792-b3db-45b2-91b9-3b0297f15fe1/volumes" Mar 09 04:36:20 crc kubenswrapper[4901]: I0309 04:36:20.107000 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:36:20 crc kubenswrapper[4901]: E0309 04:36:20.107787 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:36:22 crc kubenswrapper[4901]: I0309 04:36:22.854461 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-n25nj_53342ff7-72fc-4eda-aa06-6330700e43cb/kube-rbac-proxy/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.104069 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-frr-files/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.254727 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-n25nj_53342ff7-72fc-4eda-aa06-6330700e43cb/controller/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.297799 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-reloader/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.301846 4901 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-frr-files/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.325105 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-metrics/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.418742 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-reloader/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.627526 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-frr-files/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.636195 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-metrics/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.640457 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-reloader/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.685604 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-metrics/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.814764 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-frr-files/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.814814 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-metrics/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.846009 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/cp-reloader/0.log" Mar 09 04:36:23 crc kubenswrapper[4901]: I0309 04:36:23.895021 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/controller/0.log" Mar 09 04:36:24 crc kubenswrapper[4901]: I0309 04:36:24.004492 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/frr-metrics/0.log" Mar 09 04:36:24 crc kubenswrapper[4901]: I0309 04:36:24.032843 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/kube-rbac-proxy/0.log" Mar 09 04:36:24 crc kubenswrapper[4901]: I0309 04:36:24.096895 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/kube-rbac-proxy-frr/0.log" Mar 09 04:36:24 crc kubenswrapper[4901]: I0309 04:36:24.181382 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/reloader/0.log" Mar 09 04:36:24 crc kubenswrapper[4901]: I0309 04:36:24.335369 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-z2jph_6c138591-09ca-4d38-90fa-61a52081ac72/frr-k8s-webhook-server/0.log" Mar 09 04:36:24 crc kubenswrapper[4901]: I0309 04:36:24.502894 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-64bbdf86d5-89qqq_79f9ef23-e8a8-4608-90fb-83ee291b5794/manager/0.log" Mar 09 04:36:24 crc kubenswrapper[4901]: I0309 04:36:24.622021 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7dbdd58ff4-c2bzv_a342f2a2-396a-4a32-b09e-0e3327534ca6/webhook-server/0.log" Mar 09 04:36:24 crc kubenswrapper[4901]: I0309 04:36:24.786481 4901 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l7twq_21029bf0-ecef-45a6-8600-10f73ef1949b/kube-rbac-proxy/0.log" Mar 09 04:36:25 crc kubenswrapper[4901]: I0309 04:36:25.363948 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l7twq_21029bf0-ecef-45a6-8600-10f73ef1949b/speaker/0.log" Mar 09 04:36:26 crc kubenswrapper[4901]: I0309 04:36:26.101400 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-llmsx_95d29111-dd64-49e3-8c7d-2924c047094b/frr/0.log" Mar 09 04:36:31 crc kubenswrapper[4901]: I0309 04:36:31.106536 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:36:31 crc kubenswrapper[4901]: E0309 04:36:31.107614 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:36:39 crc kubenswrapper[4901]: I0309 04:36:39.430822 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs_ecd1807d-8e6c-4310-b7ba-e710f97e7d87/util/0.log" Mar 09 04:36:39 crc kubenswrapper[4901]: I0309 04:36:39.900975 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs_ecd1807d-8e6c-4310-b7ba-e710f97e7d87/util/0.log" Mar 09 04:36:39 crc kubenswrapper[4901]: I0309 04:36:39.911851 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs_ecd1807d-8e6c-4310-b7ba-e710f97e7d87/pull/0.log" Mar 09 04:36:39 crc kubenswrapper[4901]: I0309 04:36:39.921505 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs_ecd1807d-8e6c-4310-b7ba-e710f97e7d87/pull/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.002857 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs_ecd1807d-8e6c-4310-b7ba-e710f97e7d87/util/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.076686 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs_ecd1807d-8e6c-4310-b7ba-e710f97e7d87/extract/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.174655 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82scnqs_ecd1807d-8e6c-4310-b7ba-e710f97e7d87/pull/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.200199 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v_a365c75a-4afc-41ca-8005-4674a8097d40/util/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.387415 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v_a365c75a-4afc-41ca-8005-4674a8097d40/util/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.407644 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v_a365c75a-4afc-41ca-8005-4674a8097d40/pull/0.log" Mar 09 
04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.456097 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v_a365c75a-4afc-41ca-8005-4674a8097d40/pull/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.592648 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v_a365c75a-4afc-41ca-8005-4674a8097d40/util/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.594032 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v_a365c75a-4afc-41ca-8005-4674a8097d40/extract/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.606922 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bnx7v_a365c75a-4afc-41ca-8005-4674a8097d40/pull/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.743051 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xkv4b_6306095b-e7a3-4041-b513-2340505b5bef/extract-utilities/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.934808 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xkv4b_6306095b-e7a3-4041-b513-2340505b5bef/extract-utilities/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.950742 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xkv4b_6306095b-e7a3-4041-b513-2340505b5bef/extract-content/0.log" Mar 09 04:36:40 crc kubenswrapper[4901]: I0309 04:36:40.959372 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xkv4b_6306095b-e7a3-4041-b513-2340505b5bef/extract-content/0.log" Mar 09 
04:36:41 crc kubenswrapper[4901]: I0309 04:36:41.118388 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xkv4b_6306095b-e7a3-4041-b513-2340505b5bef/extract-utilities/0.log" Mar 09 04:36:41 crc kubenswrapper[4901]: I0309 04:36:41.137862 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xkv4b_6306095b-e7a3-4041-b513-2340505b5bef/extract-content/0.log" Mar 09 04:36:41 crc kubenswrapper[4901]: I0309 04:36:41.329394 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r8t8d_6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4/extract-utilities/0.log" Mar 09 04:36:41 crc kubenswrapper[4901]: I0309 04:36:41.505753 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r8t8d_6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4/extract-utilities/0.log" Mar 09 04:36:41 crc kubenswrapper[4901]: I0309 04:36:41.506628 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r8t8d_6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4/extract-content/0.log" Mar 09 04:36:41 crc kubenswrapper[4901]: I0309 04:36:41.584859 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r8t8d_6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4/extract-content/0.log" Mar 09 04:36:41 crc kubenswrapper[4901]: I0309 04:36:41.776626 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r8t8d_6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4/extract-content/0.log" Mar 09 04:36:41 crc kubenswrapper[4901]: I0309 04:36:41.831062 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xkv4b_6306095b-e7a3-4041-b513-2340505b5bef/registry-server/0.log" Mar 09 04:36:41 crc kubenswrapper[4901]: I0309 04:36:41.893913 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-r8t8d_6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4/extract-utilities/0.log" Mar 09 04:36:42 crc kubenswrapper[4901]: I0309 04:36:42.090372 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx_cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6/util/0.log" Mar 09 04:36:42 crc kubenswrapper[4901]: I0309 04:36:42.251555 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx_cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6/util/0.log" Mar 09 04:36:42 crc kubenswrapper[4901]: I0309 04:36:42.335845 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx_cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6/pull/0.log" Mar 09 04:36:42 crc kubenswrapper[4901]: I0309 04:36:42.352008 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx_cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6/pull/0.log" Mar 09 04:36:42 crc kubenswrapper[4901]: I0309 04:36:42.517423 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx_cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6/util/0.log" Mar 09 04:36:42 crc kubenswrapper[4901]: I0309 04:36:42.549794 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx_cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6/pull/0.log" Mar 09 04:36:42 crc kubenswrapper[4901]: I0309 04:36:42.560353 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44t8xx_cd3f1bf2-2f9e-4369-9fb6-e042d67eeac6/extract/0.log" Mar 09 04:36:42 crc 
kubenswrapper[4901]: I0309 04:36:42.584130 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r8t8d_6dd2e3f1-5a7c-44fc-900c-706a3fba8ff4/registry-server/0.log" Mar 09 04:36:42 crc kubenswrapper[4901]: I0309 04:36:42.895250 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xrwmx_4383cf51-078c-4924-ac4d-746918c62fad/marketplace-operator/0.log" Mar 09 04:36:42 crc kubenswrapper[4901]: I0309 04:36:42.941227 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsnfd_cd6311cf-10e6-4c9f-a90c-9ed2a95680d8/extract-utilities/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.100368 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsnfd_cd6311cf-10e6-4c9f-a90c-9ed2a95680d8/extract-utilities/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.142133 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsnfd_cd6311cf-10e6-4c9f-a90c-9ed2a95680d8/extract-content/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.196483 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsnfd_cd6311cf-10e6-4c9f-a90c-9ed2a95680d8/extract-content/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.274547 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsnfd_cd6311cf-10e6-4c9f-a90c-9ed2a95680d8/extract-content/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.299356 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsnfd_cd6311cf-10e6-4c9f-a90c-9ed2a95680d8/extract-utilities/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.501579 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-8l687_4331116b-95d4-404d-9f0a-97919df59eb4/extract-utilities/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.519308 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsnfd_cd6311cf-10e6-4c9f-a90c-9ed2a95680d8/registry-server/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.602929 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8l687_4331116b-95d4-404d-9f0a-97919df59eb4/extract-utilities/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.603681 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8l687_4331116b-95d4-404d-9f0a-97919df59eb4/extract-content/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.650852 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8l687_4331116b-95d4-404d-9f0a-97919df59eb4/extract-content/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.819119 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8l687_4331116b-95d4-404d-9f0a-97919df59eb4/extract-content/0.log" Mar 09 04:36:43 crc kubenswrapper[4901]: I0309 04:36:43.826648 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8l687_4331116b-95d4-404d-9f0a-97919df59eb4/extract-utilities/0.log" Mar 09 04:36:44 crc kubenswrapper[4901]: I0309 04:36:44.471140 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8l687_4331116b-95d4-404d-9f0a-97919df59eb4/registry-server/0.log" Mar 09 04:36:44 crc kubenswrapper[4901]: I0309 04:36:44.694498 4901 scope.go:117] "RemoveContainer" containerID="75698da71b77ab4f8aaa4215103d2e1275e326ecff932edfb55bd3c5a047d420" Mar 09 04:36:46 crc kubenswrapper[4901]: I0309 04:36:46.133635 
4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:36:46 crc kubenswrapper[4901]: E0309 04:36:46.150197 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:36:59 crc kubenswrapper[4901]: I0309 04:36:59.107182 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:36:59 crc kubenswrapper[4901]: E0309 04:36:59.107906 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:37:13 crc kubenswrapper[4901]: I0309 04:37:13.106124 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:37:13 crc kubenswrapper[4901]: E0309 04:37:13.107007 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:37:24 crc kubenswrapper[4901]: I0309 
04:37:24.107440 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:37:24 crc kubenswrapper[4901]: E0309 04:37:24.108411 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:37:36 crc kubenswrapper[4901]: I0309 04:37:36.117629 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:37:36 crc kubenswrapper[4901]: E0309 04:37:36.118637 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:37:48 crc kubenswrapper[4901]: I0309 04:37:48.107507 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:37:48 crc kubenswrapper[4901]: E0309 04:37:48.110198 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:38:00 crc 
kubenswrapper[4901]: I0309 04:38:00.166564 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550518-gdx2m"] Mar 09 04:38:00 crc kubenswrapper[4901]: E0309 04:38:00.168206 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487c302a-a733-4e69-9cb3-a7e73903dcf3" containerName="oc" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.168234 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="487c302a-a733-4e69-9cb3-a7e73903dcf3" containerName="oc" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.168398 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="487c302a-a733-4e69-9cb3-a7e73903dcf3" containerName="oc" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.168894 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550518-gdx2m" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.171853 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.172033 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.172146 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.178472 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550518-gdx2m"] Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.220969 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs4lw\" (UniqueName: \"kubernetes.io/projected/9e2d9cfa-dcf8-47b8-9929-8befdf926bb8-kube-api-access-rs4lw\") pod \"auto-csr-approver-29550518-gdx2m\" (UID: \"9e2d9cfa-dcf8-47b8-9929-8befdf926bb8\") " 
pod="openshift-infra/auto-csr-approver-29550518-gdx2m" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.322793 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs4lw\" (UniqueName: \"kubernetes.io/projected/9e2d9cfa-dcf8-47b8-9929-8befdf926bb8-kube-api-access-rs4lw\") pod \"auto-csr-approver-29550518-gdx2m\" (UID: \"9e2d9cfa-dcf8-47b8-9929-8befdf926bb8\") " pod="openshift-infra/auto-csr-approver-29550518-gdx2m" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.351833 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs4lw\" (UniqueName: \"kubernetes.io/projected/9e2d9cfa-dcf8-47b8-9929-8befdf926bb8-kube-api-access-rs4lw\") pod \"auto-csr-approver-29550518-gdx2m\" (UID: \"9e2d9cfa-dcf8-47b8-9929-8befdf926bb8\") " pod="openshift-infra/auto-csr-approver-29550518-gdx2m" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.520849 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550518-gdx2m" Mar 09 04:38:00 crc kubenswrapper[4901]: I0309 04:38:00.985981 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550518-gdx2m"] Mar 09 04:38:00 crc kubenswrapper[4901]: W0309 04:38:00.998854 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e2d9cfa_dcf8_47b8_9929_8befdf926bb8.slice/crio-483d0ac2b74377ad09184c9b0496cfe9bf02b729e8f73985293afcf707368554 WatchSource:0}: Error finding container 483d0ac2b74377ad09184c9b0496cfe9bf02b729e8f73985293afcf707368554: Status 404 returned error can't find the container with id 483d0ac2b74377ad09184c9b0496cfe9bf02b729e8f73985293afcf707368554 Mar 09 04:38:01 crc kubenswrapper[4901]: I0309 04:38:01.002196 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 04:38:01 crc kubenswrapper[4901]: 
I0309 04:38:01.106948 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:38:01 crc kubenswrapper[4901]: E0309 04:38:01.107370 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:38:01 crc kubenswrapper[4901]: I0309 04:38:01.988306 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550518-gdx2m" event={"ID":"9e2d9cfa-dcf8-47b8-9929-8befdf926bb8","Type":"ContainerStarted","Data":"483d0ac2b74377ad09184c9b0496cfe9bf02b729e8f73985293afcf707368554"} Mar 09 04:38:02 crc kubenswrapper[4901]: I0309 04:38:02.998475 4901 generic.go:334] "Generic (PLEG): container finished" podID="9e2d9cfa-dcf8-47b8-9929-8befdf926bb8" containerID="ce0519d2fcb9a9b62accac89776b338a3c24090cfd75f793b1051177e2e98c3c" exitCode=0 Mar 09 04:38:02 crc kubenswrapper[4901]: I0309 04:38:02.998520 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550518-gdx2m" event={"ID":"9e2d9cfa-dcf8-47b8-9929-8befdf926bb8","Type":"ContainerDied","Data":"ce0519d2fcb9a9b62accac89776b338a3c24090cfd75f793b1051177e2e98c3c"} Mar 09 04:38:04 crc kubenswrapper[4901]: I0309 04:38:04.363454 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550518-gdx2m" Mar 09 04:38:04 crc kubenswrapper[4901]: I0309 04:38:04.390438 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs4lw\" (UniqueName: \"kubernetes.io/projected/9e2d9cfa-dcf8-47b8-9929-8befdf926bb8-kube-api-access-rs4lw\") pod \"9e2d9cfa-dcf8-47b8-9929-8befdf926bb8\" (UID: \"9e2d9cfa-dcf8-47b8-9929-8befdf926bb8\") " Mar 09 04:38:04 crc kubenswrapper[4901]: I0309 04:38:04.395922 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e2d9cfa-dcf8-47b8-9929-8befdf926bb8-kube-api-access-rs4lw" (OuterVolumeSpecName: "kube-api-access-rs4lw") pod "9e2d9cfa-dcf8-47b8-9929-8befdf926bb8" (UID: "9e2d9cfa-dcf8-47b8-9929-8befdf926bb8"). InnerVolumeSpecName "kube-api-access-rs4lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:38:04 crc kubenswrapper[4901]: I0309 04:38:04.491598 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs4lw\" (UniqueName: \"kubernetes.io/projected/9e2d9cfa-dcf8-47b8-9929-8befdf926bb8-kube-api-access-rs4lw\") on node \"crc\" DevicePath \"\"" Mar 09 04:38:05 crc kubenswrapper[4901]: I0309 04:38:05.012860 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550518-gdx2m" event={"ID":"9e2d9cfa-dcf8-47b8-9929-8befdf926bb8","Type":"ContainerDied","Data":"483d0ac2b74377ad09184c9b0496cfe9bf02b729e8f73985293afcf707368554"} Mar 09 04:38:05 crc kubenswrapper[4901]: I0309 04:38:05.013144 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="483d0ac2b74377ad09184c9b0496cfe9bf02b729e8f73985293afcf707368554" Mar 09 04:38:05 crc kubenswrapper[4901]: I0309 04:38:05.013207 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550518-gdx2m" Mar 09 04:38:05 crc kubenswrapper[4901]: I0309 04:38:05.464721 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550512-d8j8v"] Mar 09 04:38:05 crc kubenswrapper[4901]: I0309 04:38:05.475719 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550512-d8j8v"] Mar 09 04:38:06 crc kubenswrapper[4901]: I0309 04:38:06.123319 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa014b3-f618-46b3-ba3b-94d0627b7572" path="/var/lib/kubelet/pods/faa014b3-f618-46b3-ba3b-94d0627b7572/volumes" Mar 09 04:38:16 crc kubenswrapper[4901]: I0309 04:38:16.120958 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:38:16 crc kubenswrapper[4901]: E0309 04:38:16.121965 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:38:18 crc kubenswrapper[4901]: I0309 04:38:18.184791 4901 generic.go:334] "Generic (PLEG): container finished" podID="79837b41-60c2-458a-8e72-77886e95872d" containerID="d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc" exitCode=0 Mar 09 04:38:18 crc kubenswrapper[4901]: I0309 04:38:18.184850 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mthrn/must-gather-mbwfc" event={"ID":"79837b41-60c2-458a-8e72-77886e95872d","Type":"ContainerDied","Data":"d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc"} Mar 09 04:38:18 crc kubenswrapper[4901]: I0309 04:38:18.187460 4901 
scope.go:117] "RemoveContainer" containerID="d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc" Mar 09 04:38:18 crc kubenswrapper[4901]: I0309 04:38:18.292660 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mthrn_must-gather-mbwfc_79837b41-60c2-458a-8e72-77886e95872d/gather/0.log" Mar 09 04:38:25 crc kubenswrapper[4901]: I0309 04:38:25.512395 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mthrn/must-gather-mbwfc"] Mar 09 04:38:25 crc kubenswrapper[4901]: I0309 04:38:25.513346 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mthrn/must-gather-mbwfc" podUID="79837b41-60c2-458a-8e72-77886e95872d" containerName="copy" containerID="cri-o://431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9" gracePeriod=2 Mar 09 04:38:25 crc kubenswrapper[4901]: I0309 04:38:25.529629 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mthrn/must-gather-mbwfc"] Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.003865 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mthrn_must-gather-mbwfc_79837b41-60c2-458a-8e72-77886e95872d/copy/0.log" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.004892 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mthrn/must-gather-mbwfc" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.111604 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79837b41-60c2-458a-8e72-77886e95872d-must-gather-output\") pod \"79837b41-60c2-458a-8e72-77886e95872d\" (UID: \"79837b41-60c2-458a-8e72-77886e95872d\") " Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.111739 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk9c2\" (UniqueName: \"kubernetes.io/projected/79837b41-60c2-458a-8e72-77886e95872d-kube-api-access-sk9c2\") pod \"79837b41-60c2-458a-8e72-77886e95872d\" (UID: \"79837b41-60c2-458a-8e72-77886e95872d\") " Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.119998 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79837b41-60c2-458a-8e72-77886e95872d-kube-api-access-sk9c2" (OuterVolumeSpecName: "kube-api-access-sk9c2") pod "79837b41-60c2-458a-8e72-77886e95872d" (UID: "79837b41-60c2-458a-8e72-77886e95872d"). InnerVolumeSpecName "kube-api-access-sk9c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.215746 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk9c2\" (UniqueName: \"kubernetes.io/projected/79837b41-60c2-458a-8e72-77886e95872d-kube-api-access-sk9c2\") on node \"crc\" DevicePath \"\"" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.242634 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79837b41-60c2-458a-8e72-77886e95872d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "79837b41-60c2-458a-8e72-77886e95872d" (UID: "79837b41-60c2-458a-8e72-77886e95872d"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.273032 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mthrn_must-gather-mbwfc_79837b41-60c2-458a-8e72-77886e95872d/copy/0.log" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.274780 4901 generic.go:334] "Generic (PLEG): container finished" podID="79837b41-60c2-458a-8e72-77886e95872d" containerID="431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9" exitCode=143 Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.274851 4901 scope.go:117] "RemoveContainer" containerID="431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.274998 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mthrn/must-gather-mbwfc" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.305019 4901 scope.go:117] "RemoveContainer" containerID="d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.318516 4901 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79837b41-60c2-458a-8e72-77886e95872d-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.361902 4901 scope.go:117] "RemoveContainer" containerID="431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9" Mar 09 04:38:26 crc kubenswrapper[4901]: E0309 04:38:26.362375 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9\": container with ID starting with 431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9 not found: ID does not exist" 
containerID="431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.362426 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9"} err="failed to get container status \"431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9\": rpc error: code = NotFound desc = could not find container \"431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9\": container with ID starting with 431356b34be708cf60262caac9cf7429fa18787d7b0e4bbbb02cbe34c366c2f9 not found: ID does not exist" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.362473 4901 scope.go:117] "RemoveContainer" containerID="d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc" Mar 09 04:38:26 crc kubenswrapper[4901]: E0309 04:38:26.362749 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc\": container with ID starting with d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc not found: ID does not exist" containerID="d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc" Mar 09 04:38:26 crc kubenswrapper[4901]: I0309 04:38:26.362777 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc"} err="failed to get container status \"d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc\": rpc error: code = NotFound desc = could not find container \"d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc\": container with ID starting with d792d9a1d6db75eea2e213f06be048dc8538f2f24e71a060113e0d106a8464dc not found: ID does not exist" Mar 09 04:38:28 crc kubenswrapper[4901]: I0309 04:38:28.106298 4901 scope.go:117] 
"RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:38:28 crc kubenswrapper[4901]: E0309 04:38:28.106731 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:38:28 crc kubenswrapper[4901]: I0309 04:38:28.123661 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79837b41-60c2-458a-8e72-77886e95872d" path="/var/lib/kubelet/pods/79837b41-60c2-458a-8e72-77886e95872d/volumes" Mar 09 04:38:39 crc kubenswrapper[4901]: I0309 04:38:39.106733 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:38:39 crc kubenswrapper[4901]: E0309 04:38:39.108584 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:38:44 crc kubenswrapper[4901]: I0309 04:38:44.792528 4901 scope.go:117] "RemoveContainer" containerID="f6ac12b0c6378005df9db812c89474ac5e6181a1aae0b5a0d62a4dae49fc67cd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.167206 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-56nqd"] Mar 09 04:38:47 crc kubenswrapper[4901]: E0309 04:38:47.168083 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="79837b41-60c2-458a-8e72-77886e95872d" containerName="copy" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.168097 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="79837b41-60c2-458a-8e72-77886e95872d" containerName="copy" Mar 09 04:38:47 crc kubenswrapper[4901]: E0309 04:38:47.168145 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79837b41-60c2-458a-8e72-77886e95872d" containerName="gather" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.168152 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="79837b41-60c2-458a-8e72-77886e95872d" containerName="gather" Mar 09 04:38:47 crc kubenswrapper[4901]: E0309 04:38:47.168160 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e2d9cfa-dcf8-47b8-9929-8befdf926bb8" containerName="oc" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.168168 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e2d9cfa-dcf8-47b8-9929-8befdf926bb8" containerName="oc" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.168349 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e2d9cfa-dcf8-47b8-9929-8befdf926bb8" containerName="oc" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.168360 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="79837b41-60c2-458a-8e72-77886e95872d" containerName="copy" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.168371 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="79837b41-60c2-458a-8e72-77886e95872d" containerName="gather" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.169529 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.177922 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56nqd"] Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.307783 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2nfh\" (UniqueName: \"kubernetes.io/projected/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-kube-api-access-f2nfh\") pod \"community-operators-56nqd\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.307941 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-utilities\") pod \"community-operators-56nqd\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.308204 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-catalog-content\") pod \"community-operators-56nqd\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.409627 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-catalog-content\") pod \"community-operators-56nqd\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.409684 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f2nfh\" (UniqueName: \"kubernetes.io/projected/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-kube-api-access-f2nfh\") pod \"community-operators-56nqd\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.409751 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-utilities\") pod \"community-operators-56nqd\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.410194 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-utilities\") pod \"community-operators-56nqd\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.410191 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-catalog-content\") pod \"community-operators-56nqd\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.431250 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2nfh\" (UniqueName: \"kubernetes.io/projected/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-kube-api-access-f2nfh\") pod \"community-operators-56nqd\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.494988 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:47 crc kubenswrapper[4901]: I0309 04:38:47.959930 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56nqd"] Mar 09 04:38:48 crc kubenswrapper[4901]: I0309 04:38:48.481680 4901 generic.go:334] "Generic (PLEG): container finished" podID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerID="bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77" exitCode=0 Mar 09 04:38:48 crc kubenswrapper[4901]: I0309 04:38:48.481761 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56nqd" event={"ID":"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74","Type":"ContainerDied","Data":"bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77"} Mar 09 04:38:48 crc kubenswrapper[4901]: I0309 04:38:48.482038 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56nqd" event={"ID":"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74","Type":"ContainerStarted","Data":"6ba2fc40b1acf015d4894da4a001c147a0e6628af74c3917ee61b460ca5f98ff"} Mar 09 04:38:49 crc kubenswrapper[4901]: I0309 04:38:49.494850 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56nqd" event={"ID":"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74","Type":"ContainerStarted","Data":"8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba"} Mar 09 04:38:50 crc kubenswrapper[4901]: I0309 04:38:50.507274 4901 generic.go:334] "Generic (PLEG): container finished" podID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerID="8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba" exitCode=0 Mar 09 04:38:50 crc kubenswrapper[4901]: I0309 04:38:50.507367 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56nqd" 
event={"ID":"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74","Type":"ContainerDied","Data":"8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba"} Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.107385 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:38:51 crc kubenswrapper[4901]: E0309 04:38:51.108335 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.341314 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-67pgs"] Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.342978 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.346471 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67pgs"] Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.488479 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bklf\" (UniqueName: \"kubernetes.io/projected/525481f4-ac39-4e03-b0a4-13ec08983073-kube-api-access-5bklf\") pod \"redhat-marketplace-67pgs\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.488724 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-catalog-content\") pod \"redhat-marketplace-67pgs\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.488836 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-utilities\") pod \"redhat-marketplace-67pgs\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.523873 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56nqd" event={"ID":"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74","Type":"ContainerStarted","Data":"575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a"} Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.540979 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-56nqd" podStartSLOduration=2.077644221 podStartE2EDuration="4.540960246s" podCreationTimestamp="2026-03-09 04:38:47 +0000 UTC" firstStartedPulling="2026-03-09 04:38:48.485835854 +0000 UTC m=+7053.075499626" lastFinishedPulling="2026-03-09 04:38:50.949151909 +0000 UTC m=+7055.538815651" observedRunningTime="2026-03-09 04:38:51.539528091 +0000 UTC m=+7056.129191823" watchObservedRunningTime="2026-03-09 04:38:51.540960246 +0000 UTC m=+7056.130623988" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.590704 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bklf\" (UniqueName: \"kubernetes.io/projected/525481f4-ac39-4e03-b0a4-13ec08983073-kube-api-access-5bklf\") pod \"redhat-marketplace-67pgs\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.591097 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-catalog-content\") pod \"redhat-marketplace-67pgs\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.591159 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-utilities\") pod \"redhat-marketplace-67pgs\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.592588 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-catalog-content\") pod \"redhat-marketplace-67pgs\" (UID: 
\"525481f4-ac39-4e03-b0a4-13ec08983073\") " pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.592806 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-utilities\") pod \"redhat-marketplace-67pgs\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.627623 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bklf\" (UniqueName: \"kubernetes.io/projected/525481f4-ac39-4e03-b0a4-13ec08983073-kube-api-access-5bklf\") pod \"redhat-marketplace-67pgs\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:51 crc kubenswrapper[4901]: I0309 04:38:51.663832 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:38:52 crc kubenswrapper[4901]: W0309 04:38:52.113745 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod525481f4_ac39_4e03_b0a4_13ec08983073.slice/crio-34e0b03d7d96ff2c83e1d2b159b5ef69236cded4510891f34140bfb26a176da3 WatchSource:0}: Error finding container 34e0b03d7d96ff2c83e1d2b159b5ef69236cded4510891f34140bfb26a176da3: Status 404 returned error can't find the container with id 34e0b03d7d96ff2c83e1d2b159b5ef69236cded4510891f34140bfb26a176da3 Mar 09 04:38:52 crc kubenswrapper[4901]: I0309 04:38:52.135124 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67pgs"] Mar 09 04:38:52 crc kubenswrapper[4901]: I0309 04:38:52.535977 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67pgs" 
event={"ID":"525481f4-ac39-4e03-b0a4-13ec08983073","Type":"ContainerDied","Data":"d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c"} Mar 09 04:38:52 crc kubenswrapper[4901]: I0309 04:38:52.535794 4901 generic.go:334] "Generic (PLEG): container finished" podID="525481f4-ac39-4e03-b0a4-13ec08983073" containerID="d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c" exitCode=0 Mar 09 04:38:52 crc kubenswrapper[4901]: I0309 04:38:52.537965 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67pgs" event={"ID":"525481f4-ac39-4e03-b0a4-13ec08983073","Type":"ContainerStarted","Data":"34e0b03d7d96ff2c83e1d2b159b5ef69236cded4510891f34140bfb26a176da3"} Mar 09 04:38:53 crc kubenswrapper[4901]: I0309 04:38:53.554188 4901 generic.go:334] "Generic (PLEG): container finished" podID="525481f4-ac39-4e03-b0a4-13ec08983073" containerID="edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898" exitCode=0 Mar 09 04:38:53 crc kubenswrapper[4901]: I0309 04:38:53.554270 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67pgs" event={"ID":"525481f4-ac39-4e03-b0a4-13ec08983073","Type":"ContainerDied","Data":"edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898"} Mar 09 04:38:54 crc kubenswrapper[4901]: I0309 04:38:54.566820 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67pgs" event={"ID":"525481f4-ac39-4e03-b0a4-13ec08983073","Type":"ContainerStarted","Data":"2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7"} Mar 09 04:38:54 crc kubenswrapper[4901]: I0309 04:38:54.593641 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-67pgs" podStartSLOduration=2.155275032 podStartE2EDuration="3.593615837s" podCreationTimestamp="2026-03-09 04:38:51 +0000 UTC" firstStartedPulling="2026-03-09 04:38:52.539792624 +0000 UTC 
m=+7057.129456396" lastFinishedPulling="2026-03-09 04:38:53.978133439 +0000 UTC m=+7058.567797201" observedRunningTime="2026-03-09 04:38:54.586909003 +0000 UTC m=+7059.176572735" watchObservedRunningTime="2026-03-09 04:38:54.593615837 +0000 UTC m=+7059.183279579" Mar 09 04:38:57 crc kubenswrapper[4901]: I0309 04:38:57.495838 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:57 crc kubenswrapper[4901]: I0309 04:38:57.496279 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:57 crc kubenswrapper[4901]: I0309 04:38:57.577252 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:57 crc kubenswrapper[4901]: I0309 04:38:57.652978 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:38:58 crc kubenswrapper[4901]: I0309 04:38:58.738769 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-56nqd"] Mar 09 04:39:00 crc kubenswrapper[4901]: I0309 04:39:00.629476 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-56nqd" podUID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerName="registry-server" containerID="cri-o://575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a" gracePeriod=2 Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.183077 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.298422 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2nfh\" (UniqueName: \"kubernetes.io/projected/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-kube-api-access-f2nfh\") pod \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.298577 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-utilities\") pod \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.298634 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-catalog-content\") pod \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\" (UID: \"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74\") " Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.299480 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-utilities" (OuterVolumeSpecName: "utilities") pod "b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" (UID: "b6385dcc-da9d-4582-a1a3-b05ef6dbfd74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.305910 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-kube-api-access-f2nfh" (OuterVolumeSpecName: "kube-api-access-f2nfh") pod "b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" (UID: "b6385dcc-da9d-4582-a1a3-b05ef6dbfd74"). InnerVolumeSpecName "kube-api-access-f2nfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.372744 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" (UID: "b6385dcc-da9d-4582-a1a3-b05ef6dbfd74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.400984 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2nfh\" (UniqueName: \"kubernetes.io/projected/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-kube-api-access-f2nfh\") on node \"crc\" DevicePath \"\"" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.401546 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.401700 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.640736 4901 generic.go:334] "Generic (PLEG): container finished" podID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerID="575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a" exitCode=0 Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.640801 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56nqd" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.640797 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56nqd" event={"ID":"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74","Type":"ContainerDied","Data":"575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a"} Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.640981 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56nqd" event={"ID":"b6385dcc-da9d-4582-a1a3-b05ef6dbfd74","Type":"ContainerDied","Data":"6ba2fc40b1acf015d4894da4a001c147a0e6628af74c3917ee61b460ca5f98ff"} Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.641020 4901 scope.go:117] "RemoveContainer" containerID="575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.666464 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.668806 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.697368 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-56nqd"] Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.702010 4901 scope.go:117] "RemoveContainer" containerID="8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.708284 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-56nqd"] Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.740296 4901 scope.go:117] "RemoveContainer" containerID="bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77" Mar 09 04:39:01 
crc kubenswrapper[4901]: I0309 04:39:01.753279 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.764165 4901 scope.go:117] "RemoveContainer" containerID="575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a" Mar 09 04:39:01 crc kubenswrapper[4901]: E0309 04:39:01.764789 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a\": container with ID starting with 575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a not found: ID does not exist" containerID="575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.764857 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a"} err="failed to get container status \"575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a\": rpc error: code = NotFound desc = could not find container \"575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a\": container with ID starting with 575c26a83ca1cc2518ed4c800a3add08a0240395b06f5d7af3dab60ffedaec1a not found: ID does not exist" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.764897 4901 scope.go:117] "RemoveContainer" containerID="8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba" Mar 09 04:39:01 crc kubenswrapper[4901]: E0309 04:39:01.765359 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba\": container with ID starting with 8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba not found: ID does not exist" 
containerID="8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.765404 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba"} err="failed to get container status \"8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba\": rpc error: code = NotFound desc = could not find container \"8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba\": container with ID starting with 8ab9f5f25cc5417f46e11119060f7fa0c0699b66e6ee4e986ee502948b1d42ba not found: ID does not exist" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.765431 4901 scope.go:117] "RemoveContainer" containerID="bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77" Mar 09 04:39:01 crc kubenswrapper[4901]: E0309 04:39:01.765804 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77\": container with ID starting with bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77 not found: ID does not exist" containerID="bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77" Mar 09 04:39:01 crc kubenswrapper[4901]: I0309 04:39:01.765849 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77"} err="failed to get container status \"bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77\": rpc error: code = NotFound desc = could not find container \"bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77\": container with ID starting with bdc26a1b5a4e807554ffd5c4a7ba59ed2137559dd46acfc54b94a134c3576b77 not found: ID does not exist" Mar 09 04:39:02 crc kubenswrapper[4901]: I0309 04:39:02.118905 4901 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" path="/var/lib/kubelet/pods/b6385dcc-da9d-4582-a1a3-b05ef6dbfd74/volumes" Mar 09 04:39:02 crc kubenswrapper[4901]: I0309 04:39:02.702323 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:39:04 crc kubenswrapper[4901]: I0309 04:39:04.135973 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67pgs"] Mar 09 04:39:05 crc kubenswrapper[4901]: I0309 04:39:05.687344 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-67pgs" podUID="525481f4-ac39-4e03-b0a4-13ec08983073" containerName="registry-server" containerID="cri-o://2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7" gracePeriod=2 Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.114714 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:39:06 crc kubenswrapper[4901]: E0309 04:39:06.115732 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.201040 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.309385 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bklf\" (UniqueName: \"kubernetes.io/projected/525481f4-ac39-4e03-b0a4-13ec08983073-kube-api-access-5bklf\") pod \"525481f4-ac39-4e03-b0a4-13ec08983073\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.309683 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-catalog-content\") pod \"525481f4-ac39-4e03-b0a4-13ec08983073\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.309789 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-utilities\") pod \"525481f4-ac39-4e03-b0a4-13ec08983073\" (UID: \"525481f4-ac39-4e03-b0a4-13ec08983073\") " Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.311616 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-utilities" (OuterVolumeSpecName: "utilities") pod "525481f4-ac39-4e03-b0a4-13ec08983073" (UID: "525481f4-ac39-4e03-b0a4-13ec08983073"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.317248 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525481f4-ac39-4e03-b0a4-13ec08983073-kube-api-access-5bklf" (OuterVolumeSpecName: "kube-api-access-5bklf") pod "525481f4-ac39-4e03-b0a4-13ec08983073" (UID: "525481f4-ac39-4e03-b0a4-13ec08983073"). InnerVolumeSpecName "kube-api-access-5bklf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.354473 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "525481f4-ac39-4e03-b0a4-13ec08983073" (UID: "525481f4-ac39-4e03-b0a4-13ec08983073"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.412586 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.412624 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525481f4-ac39-4e03-b0a4-13ec08983073-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.412639 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bklf\" (UniqueName: \"kubernetes.io/projected/525481f4-ac39-4e03-b0a4-13ec08983073-kube-api-access-5bklf\") on node \"crc\" DevicePath \"\"" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.698669 4901 generic.go:334] "Generic (PLEG): container finished" podID="525481f4-ac39-4e03-b0a4-13ec08983073" containerID="2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7" exitCode=0 Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.698724 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67pgs" event={"ID":"525481f4-ac39-4e03-b0a4-13ec08983073","Type":"ContainerDied","Data":"2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7"} Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.698782 4901 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-67pgs" event={"ID":"525481f4-ac39-4e03-b0a4-13ec08983073","Type":"ContainerDied","Data":"34e0b03d7d96ff2c83e1d2b159b5ef69236cded4510891f34140bfb26a176da3"} Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.698813 4901 scope.go:117] "RemoveContainer" containerID="2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.699927 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67pgs" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.720167 4901 scope.go:117] "RemoveContainer" containerID="edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.740915 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67pgs"] Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.748369 4901 scope.go:117] "RemoveContainer" containerID="d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.750094 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-67pgs"] Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.779542 4901 scope.go:117] "RemoveContainer" containerID="2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7" Mar 09 04:39:06 crc kubenswrapper[4901]: E0309 04:39:06.780040 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7\": container with ID starting with 2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7 not found: ID does not exist" containerID="2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.780074 4901 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7"} err="failed to get container status \"2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7\": rpc error: code = NotFound desc = could not find container \"2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7\": container with ID starting with 2d5f4d1eb6e26f8e6501217e2ce75facf15a15d6ab118bdd258427ebaf6391c7 not found: ID does not exist" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.780095 4901 scope.go:117] "RemoveContainer" containerID="edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898" Mar 09 04:39:06 crc kubenswrapper[4901]: E0309 04:39:06.780415 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898\": container with ID starting with edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898 not found: ID does not exist" containerID="edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.780435 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898"} err="failed to get container status \"edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898\": rpc error: code = NotFound desc = could not find container \"edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898\": container with ID starting with edf7b962b70d6298a92d0842dcec259a818c034de2dea437288aa7c01d5d0898 not found: ID does not exist" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.780447 4901 scope.go:117] "RemoveContainer" containerID="d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c" Mar 09 04:39:06 crc kubenswrapper[4901]: E0309 
04:39:06.780735 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c\": container with ID starting with d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c not found: ID does not exist" containerID="d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c" Mar 09 04:39:06 crc kubenswrapper[4901]: I0309 04:39:06.780759 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c"} err="failed to get container status \"d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c\": rpc error: code = NotFound desc = could not find container \"d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c\": container with ID starting with d5323bc2926d592858d17cf4d1ea8483d55270358125d469830cf2aa3a1dba8c not found: ID does not exist" Mar 09 04:39:08 crc kubenswrapper[4901]: I0309 04:39:08.133651 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="525481f4-ac39-4e03-b0a4-13ec08983073" path="/var/lib/kubelet/pods/525481f4-ac39-4e03-b0a4-13ec08983073/volumes" Mar 09 04:39:18 crc kubenswrapper[4901]: I0309 04:39:18.108145 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:39:18 crc kubenswrapper[4901]: E0309 04:39:18.109165 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:39:33 crc kubenswrapper[4901]: I0309 04:39:33.107793 
4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:39:33 crc kubenswrapper[4901]: E0309 04:39:33.109079 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:39:45 crc kubenswrapper[4901]: I0309 04:39:45.106469 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:39:45 crc kubenswrapper[4901]: E0309 04:39:45.107483 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:39:57 crc kubenswrapper[4901]: I0309 04:39:57.106636 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:39:57 crc kubenswrapper[4901]: E0309 04:39:57.107661 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5c998_openshift-machine-config-operator(65e722e8-52c4-4bb6-9927-f378b2f7296a)\"" pod="openshift-machine-config-operator/machine-config-daemon-5c998" podUID="65e722e8-52c4-4bb6-9927-f378b2f7296a" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 
04:40:00.163368 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550520-d8z8h"] Mar 09 04:40:00 crc kubenswrapper[4901]: E0309 04:40:00.164489 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525481f4-ac39-4e03-b0a4-13ec08983073" containerName="extract-utilities" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.164516 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="525481f4-ac39-4e03-b0a4-13ec08983073" containerName="extract-utilities" Mar 09 04:40:00 crc kubenswrapper[4901]: E0309 04:40:00.164553 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerName="extract-content" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.164565 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerName="extract-content" Mar 09 04:40:00 crc kubenswrapper[4901]: E0309 04:40:00.164585 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerName="registry-server" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.164596 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerName="registry-server" Mar 09 04:40:00 crc kubenswrapper[4901]: E0309 04:40:00.164623 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525481f4-ac39-4e03-b0a4-13ec08983073" containerName="extract-content" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.164635 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="525481f4-ac39-4e03-b0a4-13ec08983073" containerName="extract-content" Mar 09 04:40:00 crc kubenswrapper[4901]: E0309 04:40:00.164654 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525481f4-ac39-4e03-b0a4-13ec08983073" containerName="registry-server" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.164667 4901 
state_mem.go:107] "Deleted CPUSet assignment" podUID="525481f4-ac39-4e03-b0a4-13ec08983073" containerName="registry-server" Mar 09 04:40:00 crc kubenswrapper[4901]: E0309 04:40:00.164690 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerName="extract-utilities" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.164702 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerName="extract-utilities" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.164997 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6385dcc-da9d-4582-a1a3-b05ef6dbfd74" containerName="registry-server" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.165031 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="525481f4-ac39-4e03-b0a4-13ec08983073" containerName="registry-server" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.165919 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550520-d8z8h" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.167976 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.169303 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lq6vs" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.169846 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.177026 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ktzj\" (UniqueName: \"kubernetes.io/projected/148f5083-5439-41ef-b6e4-ef8c36079b02-kube-api-access-5ktzj\") pod \"auto-csr-approver-29550520-d8z8h\" (UID: \"148f5083-5439-41ef-b6e4-ef8c36079b02\") " pod="openshift-infra/auto-csr-approver-29550520-d8z8h" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.180180 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550520-d8z8h"] Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.278717 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ktzj\" (UniqueName: \"kubernetes.io/projected/148f5083-5439-41ef-b6e4-ef8c36079b02-kube-api-access-5ktzj\") pod \"auto-csr-approver-29550520-d8z8h\" (UID: \"148f5083-5439-41ef-b6e4-ef8c36079b02\") " pod="openshift-infra/auto-csr-approver-29550520-d8z8h" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.302582 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ktzj\" (UniqueName: \"kubernetes.io/projected/148f5083-5439-41ef-b6e4-ef8c36079b02-kube-api-access-5ktzj\") pod \"auto-csr-approver-29550520-d8z8h\" (UID: \"148f5083-5439-41ef-b6e4-ef8c36079b02\") " 
pod="openshift-infra/auto-csr-approver-29550520-d8z8h" Mar 09 04:40:00 crc kubenswrapper[4901]: I0309 04:40:00.517584 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550520-d8z8h" Mar 09 04:40:01 crc kubenswrapper[4901]: I0309 04:40:01.008501 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550520-d8z8h"] Mar 09 04:40:01 crc kubenswrapper[4901]: W0309 04:40:01.015391 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod148f5083_5439_41ef_b6e4_ef8c36079b02.slice/crio-81b16f7130bc10ca70fc46217982f7b64656c9738c44495ca0173cb405318225 WatchSource:0}: Error finding container 81b16f7130bc10ca70fc46217982f7b64656c9738c44495ca0173cb405318225: Status 404 returned error can't find the container with id 81b16f7130bc10ca70fc46217982f7b64656c9738c44495ca0173cb405318225 Mar 09 04:40:01 crc kubenswrapper[4901]: I0309 04:40:01.268539 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550520-d8z8h" event={"ID":"148f5083-5439-41ef-b6e4-ef8c36079b02","Type":"ContainerStarted","Data":"81b16f7130bc10ca70fc46217982f7b64656c9738c44495ca0173cb405318225"} Mar 09 04:40:03 crc kubenswrapper[4901]: I0309 04:40:03.294605 4901 generic.go:334] "Generic (PLEG): container finished" podID="148f5083-5439-41ef-b6e4-ef8c36079b02" containerID="2562b46caad2a0888507b0b4d0c48bcf0ed6178ce6b1ce4f92ed28489d3f3022" exitCode=0 Mar 09 04:40:03 crc kubenswrapper[4901]: I0309 04:40:03.294688 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550520-d8z8h" event={"ID":"148f5083-5439-41ef-b6e4-ef8c36079b02","Type":"ContainerDied","Data":"2562b46caad2a0888507b0b4d0c48bcf0ed6178ce6b1ce4f92ed28489d3f3022"} Mar 09 04:40:04 crc kubenswrapper[4901]: I0309 04:40:04.746113 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550520-d8z8h" Mar 09 04:40:04 crc kubenswrapper[4901]: I0309 04:40:04.868699 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ktzj\" (UniqueName: \"kubernetes.io/projected/148f5083-5439-41ef-b6e4-ef8c36079b02-kube-api-access-5ktzj\") pod \"148f5083-5439-41ef-b6e4-ef8c36079b02\" (UID: \"148f5083-5439-41ef-b6e4-ef8c36079b02\") " Mar 09 04:40:04 crc kubenswrapper[4901]: I0309 04:40:04.879728 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148f5083-5439-41ef-b6e4-ef8c36079b02-kube-api-access-5ktzj" (OuterVolumeSpecName: "kube-api-access-5ktzj") pod "148f5083-5439-41ef-b6e4-ef8c36079b02" (UID: "148f5083-5439-41ef-b6e4-ef8c36079b02"). InnerVolumeSpecName "kube-api-access-5ktzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 04:40:04 crc kubenswrapper[4901]: I0309 04:40:04.972074 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ktzj\" (UniqueName: \"kubernetes.io/projected/148f5083-5439-41ef-b6e4-ef8c36079b02-kube-api-access-5ktzj\") on node \"crc\" DevicePath \"\"" Mar 09 04:40:05 crc kubenswrapper[4901]: I0309 04:40:05.321106 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550520-d8z8h" event={"ID":"148f5083-5439-41ef-b6e4-ef8c36079b02","Type":"ContainerDied","Data":"81b16f7130bc10ca70fc46217982f7b64656c9738c44495ca0173cb405318225"} Mar 09 04:40:05 crc kubenswrapper[4901]: I0309 04:40:05.321164 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b16f7130bc10ca70fc46217982f7b64656c9738c44495ca0173cb405318225" Mar 09 04:40:05 crc kubenswrapper[4901]: I0309 04:40:05.321717 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550520-d8z8h" Mar 09 04:40:05 crc kubenswrapper[4901]: I0309 04:40:05.825329 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550514-m9gf9"] Mar 09 04:40:05 crc kubenswrapper[4901]: I0309 04:40:05.831482 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550514-m9gf9"] Mar 09 04:40:06 crc kubenswrapper[4901]: I0309 04:40:06.122141 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c" path="/var/lib/kubelet/pods/7ca56c3b-34f4-4ea7-aa9f-5c34d067cf7c/volumes" Mar 09 04:40:08 crc kubenswrapper[4901]: I0309 04:40:08.108618 4901 scope.go:117] "RemoveContainer" containerID="314b4bd509612b348bb3d7d47b625755ea46c37745d82dbd5b718f92d9be76de" Mar 09 04:40:08 crc kubenswrapper[4901]: I0309 04:40:08.354083 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5c998" event={"ID":"65e722e8-52c4-4bb6-9927-f378b2f7296a","Type":"ContainerStarted","Data":"59682615936c92f2e7bbfae7c99fa2231959334856cd11353fcf0785ce9948cf"}